[Binary artifact (not recoverable as text): tar archive of Zuul CI output, owner core:core, containing var/home/core/zuul-output/logs/kubelet.log.gz — a gzip-compressed kubelet log.]
p<ȄAF8YX@G #c8XnڏtBɅJ ۍh^Mcձ]1@YP(h,DŹ9-8GDEEa̪퐎0]E~ֵ   G Zl/i|1;oM\̦!^&ɼiqT9WYTYWqNe/tPj0u1AF$c-1ŁC#.{6Zm`#:YB%O@YzR`d}LZC#sjb]ޥnqqtZFķO' ']0ZES#6Y.H)W-V{e%XEL' WR\b`mұbf.V,V۔+yE+yAECGP+r(k tTLzNִ)&Nyr&o[ʳ^2a"K!enes62 m4sj KAyI,{|ھwYͻ[rC!Gi,72l,I)|dIP U4HX]xU:B8Ŏ|.*\Ze^\2/g9歳p8I6[ & ts{_6,@@3fd2$"a i4"e0^Y69+l 6+=,ZL)*<ǯ%L7l*OW`w\,e̤,%ߚH/V xݖe7G8ΕRn@=rSQFt[RTyoģԚߔڎv%OJzUh|fOіGJy`Fv6pp*g6 h̡`20>*e L@!T}6L^jϗ] 8 ŽW%r*cp蘔IksZ%cb*M`ўXÀ k:;Ul1qjd%U#^|7 jΣ&a ii:r 2+/G$‚,o4AFkYYB5üfbyd OUbY3؜X"d=hoWt% u``yL l8%Yf-ꕵ6fy굃s3N$ \Ayq㳁cd{:=ˇf5=iX_b>vA!Nw:'- ,[>jIVp ƄƁ ,*.mNۨsɟKIbNNFp*j3&oHGYւNQD.Ve)!hS zD2`C!n$!EB !̍3 >w*hu&-M.GŏO?gA3i9ҋ$\֣\3N-ǒbc!JR^Px4TXىE2–*Drx?=Laԯza8Et[DW-bG<SJАf^ZXHfe!X䥡@AA'jr6|cTR0a"s3!IԨhc$OK#a2NL-wH:]7KڥX`va"j]4ubGPN'D=(B9ٺXbB)r&1ޓA ]܆]{8{f`ʝP֝xΑ*gO8ձKhoW)Q7ؔ}8,yCHѳ?0Ry,ӾgϺ~#iӡ_xvv/R[c,K\t/i5JbyƞgL J+lug-$:}g^&ҁm Isiˉv3ϮU):wNN3UkFuo?oĄﭡ]5zQw3o㷧]Qٜ0k)XO{#Zԓq)Tyx‰6>^O1%bO  olR{db#c ͩ K Bemo'ooUwz6Uѧtښ:ըY!lVe퍥w]V ,^߶ȟ|-}΃$|ť4WN>ÓRp8A=Q+ݵy*@f b3][o[G+/՗2$,0[Vɐlg}IxHd[d󰺻YQRS 'PaPRhs)) pDZ`XN\ۋ,6BFE_h&nCj~6=9i=C]z k7fTS Nhenj-Uk VVF2VXYMNyPb1.t՚ɧѥG9jMpf-@9I1>蒖]!@QͅJ{wovJ媃} Si#9Oh;l5q6G6R*JW_.sjٲuܮ'S[e.ˊ`9'R0aV[U ^ NQYTRVnk+wNiZL2&{R%- `2XK9y̡b,S1.vI:s,;ip2)rJD&CZD+fZ$I>:kM %pv͓ƮPTzXR#8F`9[eOI^iZh\RePt[o: ? W}YAF{YĴ@ӿJvܸLJ*ҀޞB?; Գ#MJ?} /e2q_>a+HNV^ Wi΄%TCx p׭¤^q瞵_9Ybxpad,G %JF9ܐVV,[H{礵>Zdw:6.;09:r gƺ38oۃ?]݉6NJ;Hu HX~B)I`2'R+/Ju)勔`_aJyxJ0ddU+^[It3Wnpé3\@`?zAˤss2)5+նSϙ/0E'TcLّ &N c꽒ս`/?|oMK5FbiU3^y VuuGYD?|5,ŏxF 2s41zAj:NvOk>>v|(Nޙӳ~)-@].T&_ r4fPs0s.qmեG?~ѕ۝fjrA|79oitʓh;Y5̬R  ]Uv9`v|`N:&WĵT\ɑَ}LN +=s%S1W$-q护9Fs%5S\*+WAWEZ͎\)mgFs~z= )4KnHNxM_ ҽOzQ AQ ;1P*yf 1Lz %w>}Yr[nrQ?\w冀VÜ{Mz@z,1M M˷f]]-UO WѺY](b6 ]c`RJe#a-PLQhW2"SbhE^șENKCN!VosXy+KsYME22$'e ,'s+S;yʕ6'iN5+.Gc4RxG[M Qfa#jäݓ^lp |Sy+N>ysn{v=7ܠv'?ßv7>W|B).GE&#VKݨz.ZX(Vb΋e@@; }}%N" uDÜẐDȈCy` e)'KDд[W*"sDKD%f:C&fI9Zk>o觔,5Z+֚8=BǔWnWK+;^MH3Gkұtd+Mh/TbJl&AAO)*dLH\i9{IR.k 31 :Sqႆy=y=:2^jFBJec*0+B dRrږCiͽcEl6%{mȧotrĖ1C akٸ! !'K<7Ў~P%Md/ˌΩWkV2j T&-1,AI&5Hmd:0K3lж&zB[wcz|^6BS[X6 BZ$(sSŃ^!-iCH8%0`'%Jpi U#Jȹ&];asr!TѢʁWpk_KqX67dt,~%駭^Z~t& hɑ܊/^8{k5:8E[+:G؁a?xMoTwy|7j9jB;J 9j[YAR)8c9bBQe!cN!|r%9"D!IjmTQ;MJ.%r"E\ň&sLP:T^/t˸5q6dSXv  e/n\sϚ;'Yk( x,^ɗmr3 @mk=ۅ7pt[?FqHgm,J&pnt WvrFbO5USZ/iՏL{Cm䅖]y&:OLc?Zy<),:Oz]:՜wtlUmyY67wwZ/0+-kwkػ2MvL+e蘺ۯ{/<4{&RNbJF$9!SxxI:#w"lFQci>ݙkiכ1(v. 
M5fۭF|^zs}iS|/.wFʆlVX̌a X%ED Bb&ȝExPtyƝa  Y B >RHemA.125gV 71IEKz 9Fc$d´r֝s))BZgaW~+L݄|<˸fCR`#| !+Zz6Frfdt,gVw3^45Rqzdۄ,*c-`>q pC* kvSp(`hP/Em3K^yB0e=1;{f|פܝoگ[{Y\4r9+N9vXJSNv8펻хg.E |Mory>9+4OРQݨ,K6 7i<#/Ϯ|:;f"Κ) n]'gEv2r!-\lQ͚-ix44- #`[s^ 痷Y+{ofc?hwz wPUh#d]+~ 3+'f6;Mzwnt[^Vn7]לc7: ncj[ޓ~ud?6/1ItEemEfxM)sd}Cweb杲+LD Wa+[[ip~y٦v!"-Yc`thܗOJ!oBsO)A.aKr_!#) PySBR^`S| PHB ;\ո-Mђ( blԦ{c/ʣ܊>S25.)8x6 }wSzav;.:[nQep:cbZJBJڑ\hRcFZt6ZGp)Wxa8krD-Zb1 M46L6("+dFSʁCAe-^Pyl FеǤ}m؟ǵ&PǍZ5, 5t&IF&͸N<ɘ2\?Ҵ iN;CҐb0HEH'5  =6W 48$Iu&I"*=s1H{EgI[*?]3ޅٚ-|Re8n='Js[q+墹ß m50'Տ{ۆN|܏s#H{f ѭy?8ZP} \W4C0J('k95 \Q3ExqC[-~E .K6p7P$7^ YeѴ t9; }hS᪺teZڞ/CúZxzqyzZ3M!*: []CyČ]Ih}LkF }Z=ꪻ t\T?tg^'^̦nÉ/p4]qT,+yI\NFϑzFB;wt7 FaV- mѸO8]Fj1t"{Ged󨋇Yꚓn%_Ɠo=k*czDqojĆ*䉿w7U\Cvo^zM?ًoޞQf^~ g` ^IIا֥Cs#ږ|?2J=rϸGkiAX.q~Z}|1MȑW=v#o\h2zJeu MhM׳.bB ;Rv%!oGbѾS7YGvcbXg$MrJQ8Q@ &M% Aă W>9 aσ[}âRKk ( dCОF6f=Ib$)x䰧SdOgģPvt =eggvʽa;7:a;_&lG Ewsf7<9^('BMTDZkkD`U䜔D<łBv!GQ/[y |.jo\&I59ceAtМIc9h$-H J\3uQ9m^X)#dH 9y5a%KΉ(@+'X7 hc.!`mrrGW SB/iJ4 zOSGe<('} Ԡ$1$cZ@"G+) q":$JCkOD#@vn :;PItΫP<| '8JgPZ6_x|K?vY©ȕcxr$U¶S4ϕgx\|WOVxr=FeWsa墥LQ*Ux+g킸FS-{vy}5wၶ74ȋk@%KםyvSxI^NWXy4+T*r A\+SN* y f@ ak>L\b0fvIIS/cY.n08/HۿǁxсBc * IrNeQg)po2"h%TIn<^q\ջnAuA2P`F˫\"GUmVwo}{7?z"3wrö?t =Kz%U$7K3蠙#JH(@ 1P UD9BaYLF22|ާ[eG_~|J}cBYGa ǣD¸`48)Crakjc˦@g<%EC2%͟PO < f\rcR2z 9 * z\qy/5[1h:jKdthW_ȓ64Fm{iU`fseϵp\% N95K] :^B ''RQW72:Ř'9B URcE]&%4DEwJPnCiX-*BSX.ye,#΂v4 a?562i%$OO" Et*{#\YbE%2W$Xe: O!{.2½geɕJx%f6ꄶ161aAX$n6zVG)r~<+&hbܱ/ZnLH@L = !g\;R/2a[LyI*7 }8 3A!pG8$Q ڹ9֨_46ϊq_4b1xF5=hăFD%94uXK}@x40 (H`NN3Ehڤ4Q EME5ZLh4!PN+E;WQ/6\YK)-)9Ń^*RVErbU@SB͜C.bܱ/>N@-kw_5ShȭD68Ա h;D?~G&eHhQR\-SލF\jV (sN%)ᐲCR<H3===Uտc]Fᇏ3΅؇r|st&*7A0-d85(Em`** @M$*or\}dss+&wqB]mkgi-weM\NycGJ$HN,EP9)S)  PX .ߠX9gYlv7w7aOyon݆ݷoM~V>.1P9:*%JsB*ABNzV"bѮ-Gw/}]}Ft[k`PN}F.1Y}D I\*YeK^6h2Z A!D*eaxK\3"Γ 4Bi?b2Yt#?K/$kCPv*lEo ;/+GdR4MBsOF땕<^ iYdA[LrF9I0},gpGVBޒD MA zqVL@$urM3Gq'##5"=k’`Ј %i@'R8 Y1rFxb(oQT]Ԓm8O{w,) PF"q"pR'LB4Ha@O&H& [# 3MenVA&Nj#|Jk Ndb9C"H&k*$8, .wg[HQd::b<(eB w#}iZH+}H(YIR+SN6 BJ[m:*>Xg v-J.%r)I&"y P:pc@#B"XيG9XJdUC+tدqtZ58 lR5ME8_8e:S{rS uwY »j.|2+o!Dd<)Y} {,**t!P6el b[?Bm|zPXb8-Qy "$"~#0}mygg/([Y+H۾T(*\A-Oo3qfq/㘿db皦ao_\vso_W7ӄSK?pIkNOCw?1<\N8?Q+a]@Q EC")o%*lY.nD~jƫ3090 $ y*QgPylsw6ifʋ Gn5םfmָά;?71,>ngBLA=S{g3ydk8%QD%+ULFBR4LH k3Ci79mQ<23B>XQ#?&vQjj6.wFn5f , 3B$5qC26fmt.x" Xj5A4Bs+*I1$CdHОq$pmd{6HpGNg'ɇ=v{ktؽ]9s+|lD”6IbѠ%AC<bE(zHhI PVQ<*gi5K<GB؟KP61T A|oq:cJFt-xyy2\wN BqtO9?0 eJ%cEPR\0jVRAJ#VIjc zZl}G .J[v=DK%xyka^OX|LQEPGy,$9yAQK)$dhP㏫eb>I\ d# JhM<1ą* { `JvLQCC ǝ?r;>/ǼH;ΟHN;uﲾȎAQe7s.nok]0s)H '<˜ 7k\&WS+y+ٷ.4>7So,1ƃ5g`VN4{gyq|$ b`F-A`d;-"xzx!;ZcmhMƝm~{CGeļIMΒ=M4ͮWoC:`ro54W Zټ.F?/.q81|'|9O&;No{ّ6'Rh/i rkwi?Cehr/;z#3]6.`8ףŽ rL>J{JXeFC/q8N^(vBw0ᯨx]NoQiquʊ~} /ՁL'f?i֫@\Qh{9u5Da5Zd5fX0ӊl|Y)ǹ \zn?3*C2"DIT!$Xpk$8Ƽґ9ÔՅ"KuwgҍwVa?ow:Mycq6Sg7wq+p8g `יB;ymZיt?D} 3k7?)zl5ܫ*'{ˌ{ˡ:++ny9vmʂ[<-S]}u`{ۭ {x ؚ+4ox`ᨋ WAΰM!s󼞱<75=c]DB^ m8Ϊʺf]Y6c+m%]>72)WEP>!Oq:Y |m]3 ~jqAӦ%C?̐5v_3vϿG.:O$7Zklo`il0g4ޡ?<{_S$z/-9 1YX`Q(ET|z2mMp  D2lE`NWY:*JCeVmͫ#2Z3Dx&Ǔc`v"Zj؂ \ Jr69fkmH /vws6Ag 62刔b!5C,:KpʂQQY@(Ǭ69&@4k#U1lF&UzKlޯХ繚wyn8ܖq[t_Nb*vA>^V+ABɂc7,߻A C *rU>(l8h}8H'52ےtbژVmL۾-݄wfb"T|A"9(!kD,?jB ڢLQ,JDBJJ;8"x)ef;k1;/Jucٌ ΀͵^*!RFPd:6)<43puz7qKz[N+w gltַܨ*.-K"\( R3B9D3,wY;dM1~@5xlŽطCIڡt:E^,tBrX*S%i{w'3TE4!m knrAa{У^fMN}IZ,h'~Ys2M.DMCIVH*T@')c6Tʩ>jeJ.KX)`42@!3+=J"*! rNh,ˑ"el떤!lUY?'ź*RkjjZqPxºMMZ9};~oi\+C#?`7\bϦy~(Я&Cb-e'‹} )@(൳J |N| nyn-PU%(B%pH7A|6MG}tr}Ca_o'{x>ѩo|>Ll]g_/"CrN:almv6\JA 0 CG;7l>F {5L˫'C!DpE⛳ś偫%Tkbޅߦr<}f[/Rpd2;p8k]3H?FGލ #LJ1nt6O9('#Vv\tc 0Oifr>g$'G)좗lO1! 
a3/)VC┴ʅ@RLP|Z@\^Cy|^0䀒*e"tF@mhe)f Vղn <(B/!謽f//2SED} >G (I+Io > {III `%Sk&(M>-dW[;TJ$oTg7 *vSm"aLKH5&eDj1X,|"e@P|o 6ћqIri<_ XBEc1)>foY26TR"%) K`Gxކ: xs0h|p ;}Չwg| y㾙Pu޽4" @Νdر;X Ơ} AvŲ.mXT;aQD ('lALeLԀI=:&Ud 4P5E"`DrBŜBua,n+BMK ^ޫ64#gCyޥnLrٶ rJA'C>RJT!T: yVT\Jj:C6OzY3ޅe hLAQb=Qw<,$N*`j,ȹ[Pi_;a)8|+i KĮy9XI/2O.v7V^FktRXV;lqlB9$Dɓ% *},aM,6YCh2Ң+9M`#(!YP\.>̠ KQª с6֌`]Vi;gl .t{]=ݨg4N}}|iq<.|<~jYc mC6]m5(bJY^X"B)6 U< YQ*Yզ.YN vHI{[Hܿv+rd/qǮhmk=u}X1+@HI`.wAF1)kɨ DŽ{A틱.nBȰiS$2p5BdHT >lFwWb،?ՈF{׈rb'C`@QC]ч͸cWN> [7ip>_лFnTzvCEA>RCُ'fݥ$cx9_}QZSO|dDHl +*N:ơ%h J)3n'BO~!wsߡDH6E&d`t'T1$Xe  CHBQ 2ZjUf 0i)ױq^ii S+PQm =u3r6Ler:M~ˈ+avX/=׷=Δ d4t]B)HL1&'6dbrTulYd uчSҰ )i kQ(d Bj\9w u-wڶ텹J%LYc!W/cc7j=jEWfol#Uw\7[c{+Ǵ8y4@:فio>Tb{F_x|dU/yG>"_)ޝߋ` ۟j)ݏ\8?{QxQ)/6EOŔQ&^z} _3pp yk(TX3F vs7IҫCP0i[ ^7,sN~~'g~ꃄV/juv2;|P< Flh$5)&| Ӄ stM*{?!o=H%dP?NϔUdGڃ@9i@ߡ͡3Q7H(x#}v}kw2P"&}.|VDū3; P )*;Ki ѨE,:m0VRj(H=$i/?5:`y`OR/!cI /ӲlrӊE{TbZNco*G(R>iU_M.IhRC)H\wwerNh/$w% _mKr@yW"L1IP 3EC(#Jr夒q232 Ե΄Y1V6%Y 2.HHS9u VV naT>+ ĐɡR jS 1b66זVD]T"KE~d 36Yh$]K#4RKg0{3 +ت٧VC{?i'}j䱀^+0%g)䤳a$Q.{늡$&RH.Eu=g8T;դOAtCx3 W.)ʂ&<""YK-b3&K8."´5}_Δj" 6U^&E@e ri =e{Qj<ayx5۩8d[Aгy5){Kn5snύ{8֓udKa 6?9lr[_tIu~k_7L1tؑе%."M8,#o!A>l eօ]_48]zd2֫q봺*[lm&qi׳&dYxLʁU:xA*3L"LkڃtQ|9($ _"1^GTM>dUsmB'L&W͓>u>w;q7 Uݏrw챞Օ[-nۤϰwb栎YwWη)ƇkVAo׬{6{HPO?{Fr k_H;}pl'U!^'W)o7ƢDI#l3d{KB d>jPמ3aфU}I DM :Э趺>㷛Cm(n!(,D&┥XYRA;)$Uރ(I,۔kѫS#RGE>'(W"Ŷb6QBA""Cp[=wȹڋ9d:5~5F}k/5(]b#ϋc1 &. 1t36Ƽ;.=Ol44!oFN 9ϛxOk;Gf7Y,J!}[{j'ẻNTR>Ϫd8I"i E^! /$$ ‹BvZb u,h \҇I739WC Se.:aV0Q Ϊ,Ŵs((}>_w#nv@RM'a-"lԐ3IGuKy`a"!zkhlnV=g /Hy낚3p[ R9k̠v*PJUjnJknOItk=dwOG>,\V 1^5AIX24X~{~@ༀ h}J%a>2;ɪDNj;x/Y'MIAտeWy]~D惻jo`̞.z˛ "TZ J)YqFVX]$M6([qk!3(c|;cztH̾5t%TG%{[۫{?I$?"TU r)Z'Q8暈ѹ֚U?Ȇ@uр ȝCyRB?僰ugv `'S-dJ |ȡ`\rr5\ۅmhhmsm޿k]yFfbA800H & VdAY :j*6F鮝 K}9W a1P&uVCٚlL$ۘ)(b D%U}zG':ܤk%jLLpk')z}s5[Wqi;4?c{4ir3aU"S(!L<!ʬVC[Io! )#)1{<`)Qk RD5&_sуNŘL$o>wUjco(X}RAZn4b ( [IphwIj[x XBEc!)($2 N@@Ɯ oTb"FE 2E2v!!Ad 7#w Zy/w {rjHf?:o}&_`j&(-|=c,J4\ !D` Uє=HD`msV†֘ FeOg/) Ĉc `]&[D$@Y{mRk>FBNV]#A^~V1Ϲ@Gi?yNnrץ =Yn1^'ꬳwB䧞8q рԠM1Ae2:E&`0с(3xL(Q%S!BJI+AQ Ii(o-:1Y_d`1+1d$ FpN ͿLOȹQVEȅ6'qɝMOyo|0+ơ*;M*B$<@gT(4ZMߊ^:φ>.3f1)E I6Ixaz3\ J[vtTDd>Xd@y,*ҼK^CI%-df,P Y$1V6FYjmsvhmȊS=Yol(g[wbLtmѡR RSrf7/QT+#Y!AOz=}o%flFcJF4RKg0 a K(j`' v&'aI+0%%Stb H% #Kd]1cIgᣏ.4(^V'YnpoM#C*"I0sɄXa(B*Rdաh`~Km%cЊa<"L[Pm">Y)R (cjbQDPuN!fD5Ra8%jK'TBUӽdtUzʭdg4:i^Vgϛ]~l^O>L'NJe;9Gyuӥ6t}3ۮi= ;N=gnt eNd͙뽜ZV,}4g͹}~lh~p܏GiC4Wg~-{:sAB1جU}I DM :, 8n8>QXMR)K1a,9vR`I"eQxX)ע2FH|NP 8-XMkC7DEz8sȹ:9Aj~جޕ>G;8l>~|  {Ƨ9MVƬy'u5)5Bhs!0ED#1H-u mSAע)NEma]RI8 O1jؤuSI_6ٙ#Ҹj&l̎w%JҐS$MRW6Iċ%P*hkF7h0="o=VYXf12C(cLpL撨; K:s`Nϝ}B4taa݇/sdL,KLƥ\lk+T5=@/JaaN>*.Vg'‰RX!! FOCj902d eGXH\J혃$Yd,:{QZ8{> &C(;sR!hHFd2v̺,!8#f8;i<ԮڶCc0IzXƔG)i"AieTAmrN&Yicy 9FNǢ(lD RVrtڇW[ЙKVaT~<]eDt"vS2SY$4`!)0+˂EO^ZidO :ҫʵ ,Qm|cT p \p&$25@6F XPԮQK%ңfa\YMKE:\pqgĭL9/@=Sru S)ń s&10#'@ ;\<. 
VӎS&<ݜ1ݒtWyP CamAp]3E?aV~}6kGF!6\2Z7;6B $IEEe ⡩ 0)2B''N(ҝD !; Jzq9uQz,yQHXF2I2k"D%ma@e, ` t&<ƒ@o%:2Klo3u5q d)E=>}Gwo=W֯ۋս׼s&ŒU\e,[#J9+c ITVhx,sF dc*N%sp%B!IMf`Re>>CWMF|úZo:8eohOz;) /tgnmzgiw}&U9NLxOLzOUc;䪮K`td7Uؑ?/ϗBhuU?wG0vjÊY&po'/~i^<vU/'gC=7=2 eN"'OZ9{ 2\6䅫k~ ;Ey< cRs&/eĒ)6J]rF 7[0 XFry4IavAɓ{`risK%"bBI$}PK2Ӛ2 u%"Kk9 B9d,)G22+ET&ap²WmmpkRDh|brXr[%3X|Jn?I^K7 ڭcǎYO)EZ*U$ʢ)$Z_p~p^s"Z]-O[z"ߗ]E;@foFٝ齹74c$n4$(n4#Wb2WVq*qym\Vj[fg5*u&}J@A)3X nvgb_Is0)4?&-j|4nZggg?KY;k\L6n^-$\ڛvQOw‹Fk^L4׼iEZOfIit55ӄ4`bH ɉ7bŪl2o,%y EMVxNW~(6%-Va /66hGгWP~T_'Wlm3s)v'gwR2PVf[ 7[f- ^dK^XݍJ9w4{Rz~|!TෘHZH˴mlќ CЩ}Onj N锁G%rQHJn<*2+x3 Y*g'WR&9p^~Xl_OT% AEI=5wkZ\,0^Jp Bp#lfȉ:͵댷.(+Y}Xz;11X-~9t %y}|Œ 5Fp 2kZsSn 0s:6U^^~qu[ޭjoa0ŶcZĴFhV<+6m' JFyD( MA4,ӿAGu1={ctǂ@J2 c=0U"fH0Ykn 8%c"*ý3usIR(IJR)tT,6f\{ 86Q$lvfgQ85|q&`G}c&ӕ ,]W hE '=VKf4`khM cBx;tD[J'z&3zЌ)Ճd9Cd%G=aTbFK1О ynSy]JWgM=?.;vO/u7CAif12Of1YdJ2GUʑI˝JY7\LGU9\ppEżv,ɳ yQgc|WtB{ sԶ|H<\ƠKhyx#(&T:sW(ABK.A VM*骼 zD)!EEf)99I}&R(SA teYM=E"W5VyRfg!ȋevO׋U] S(YBvw]z}o$p/uCͼi:=޴ alRˆZ.ˉ9Syn}(Y'Oc2E{:^.y1۹WMJ̏jp,U?Ks|g͍+p8IOu1l2eFlTGy6J }X;/iL`FA1%TH~xΨ"H'n#(p )8uﴏN]Ni:u\ڿxrɟޑ;ReCX Qh 3cR Ķ1BDfD(PtyZ 0<0ؐ fA`JgIf\{deyk".GGÐ+8qkz۷#8%&/ ]j?}DerqV f;(FZ +2=,HΌLLhu߰уv6!Nlw&z&%Dt\dXBpʩ Yjc3-zZlCG P>aÅėjx1E]WY2'bH9A+'aTIp@1)y>'Ws Q2.m)v+0T UBdD`'6Klpdf ur;}Ys 0?3<8M#Lh`I4eѠ?}æߞlڿMzNРqۨGo[?̣N=4<"䘖x^֟rPo7t72+qGr^j9L> ib55l 󎦦0 a.J[?qtF\h̟$h{Ч?ꭃՃZ(i(dF|c( "FK?LywWy0yE3FMjwu= {  g[#} f:gnaW--(%W/ {#o%?q rPϧq4ƻH!['x{WfgsH;? ܘǝWLJY?1f we-X3*Xg~X~'NiM&8ڻ][Իm89?{۸e a,LӮ@cw`d~6dw)ReYijKm%@)֩[{nY|QXll9]-y?= [4}u ez=!pN|iy>idokO=]ɾ3w1Ta)cݥWT{ij.ή߬!DO9 Ӳ{.a9z- ^BRdG.żÐ^דt}s! 6=oܙ&U󚭧<.yBÛ,JeӐ[IyC,5vջbwceZ3cu.k; =]j|5g;|g7&eri3lwß^AK7ɾ.>d+hg9lrq+th)R!w8kR8]U^qtqj>hvE(9}r تzwsi<7˫1ݾ )봚ޗ|u#4⧳;VF7.f jvj;7IKhzݷU;Sq3n ګpf` {Wxj|8k;^5{w=7jg< ԥZCBVOc͈֡Px_M'gsMҥ}M~]Dh[jhli$.6<6 0:QkL2:zK&+&T!QAH[? ҧ2iS'[l'}1VhvFހʅ9ֈaf/zǪiե{,rtrGLl$&]RTץCPeU'>|APz]$\/^̓6\}om65Pz*\WM՜t9hYµʺu"_F kz3 \Jʪ?W7?ru, Z.?ĩSB?pO|u\}Ӫ:>]p5[\}Sz0HWf9*ٶ}/f{j0,^D!Z(:BT=5FA;O.t;b91[psxVåF<|ˇymKu)/5{4yi$rn)pٗ˂]JJk-/qKyxJr懋&b>ZIMV<1QIx̣Sj<rwnlQ6IlVdG(8[ KjKm\_wW&AҬhaVS<^ m39轳 .Bm,O/)T$JhYX0sg<5Iwc3҄_FGF-i(Zpk1dphc(M=!R*AQ^>m9`?M ks5cvoZكE+n./y ߃"]`TQjH*U}un{rQ% QT|UU_CBܔx:KcQQR5BTy!lMMG3=>0$^_|#,ySʴ!=|\}* n':oٺBt x5(݊=;֗;jL# Q4ͥQK $VU҈s!jye@Cty!tEM8;}iΗn5./SqŅ9rU\f$ۊ}b}{"}BdiX: ?7ro.h?]PM_:9P&$ipx^zZ[tE@W@W=%ۡ_رtoR|wܕSWnef?F?m&Sj&*"O%Gӷ U5nOfy/;9┙c1;7V=iJ%doh|_hvi(~A4- ts}6jx4^ME'];G"ma'uL,l@ݎm|={ l:#,%P+mLɇrBѫve"n9Og[ĎJpzW~ڛ0Cәf>Ƚf{7dA5^K]kgFj*Xh \7O@PѼ3&?H+ \7t{0O]@L KXj7tUڽ/ZEͮUÁ^ ]If-1{DW0{CWBW_9]ҕ}`gU{S F:]JsM>Ulg pg]؈ Z P2{jteRM־W Ό̮n.I"^ƼX@(+\ Տ-û2 J#ѽj.g" Zw]eJ5+DWVGt3V>׻v@WKRt%T|s^ZΩs^UF4vӺ+w>LOD]۷oNQ~;m77t];;T{4gj,7#H:] ;7]w(v"\̿By9?]}@nTOPY'jŭRBMp&^ӖrSGGˆ3ˍ1eW7lK {!=Ī`7w?}op^[}@L]Gl<0Vd7P`?!"6Gaf58 _\6MefGstA\g"GMZq4|$1}< WGqLj.cRz&8gDI[8zhP2+ q%ZZnMz 9%<81{6Q"_gpS,;4hE2kx$gQW^RG$TZFpվ^[$N„Ή+$yYe#Y\XS5]ƦqtRR"sb:eb̙6j gV1S)R 0D+DSr)C!jS%c`@DΔH]rgW3Չ5u\VRn. 
|)g)C+%hY JJ&-LTDH1 pwS_z BTǎ) o2( pXC *$w(*mpfa{=GR2ͭgPFhC*$\0obAcD~.?A+c|HhD.lt6k'B ψI1Zm˻AzXF%VInSvm\,H栨Y[/`IT{aJ?a.)ef'Z*PMBKY˲ $81$skHV $-@x}ɨ5CaTfXY,G]™dep!Ƒ֑CsɁ.d!X )E$ȅfƹL*&&L!%QtEz,:=JP$d᫈;I`<M&cuY NB; YdV/G#=.^ɐ@MvgWXktdaHcÐ.v>P!/sP6D*7p*]%@_z1#X<ˌt&(},܃1:h6X*Zyput +Rٻ7n,WLJC@Lv`f6aAPUdZo-;i-[ƀJC{Yt8sԘp=tqJ5"Н)M L c~6 sT@0j#E+hqm2 ~mB@t58rXd*5  5Ft.XN1iZDkRkE+i-#k4ZIP2[v Ԥd:i[Dd^w *Q݌0 %0lȾw OAK&C6dɵ]s?X~=xy\.]N3Xx^OM*PfR[p/i0uB7Yk'f1z-tD1CӿBAٵh 7^GmF֭wuRum伖pY]S G2ha#[5̈]bpaP"Gx rh$xsC-F}-h:(:[ wS!;(X6HH :Ш66'2(Hs=Zd}z [bcZBB?v=+qҤ,S' ewo"g0?$o-<[a PҢ9FHml,(JFbk:00^;pBc[qO҉BѨ6c99bNmVwZ0Gj":CTkɷALLQ&chajj?uS{y:-;ԩM&kA&% ܍F>"+Pxp* -Zغ/V^HOĔ n %p*wwK 9zluihWNT 2ٌ-l?};9Gobnh Bg3Zܼ\,^Uo<_MWoƻ>@Pw mw %\KXD͗T?Ӳ+)Tj/uvo~"N/K7j#mgSԱ|50HMݡGֈP)jb@W/F Dp,E DhP,@Y J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@:_%JR(*O}B@@gBF+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@BBIJ (7b@=@hqr%PJsTE|&+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J VRTU1ɕ@JN: mpQ =,%:@ X 潮W@Y Jc)#U ꌔ@T}>)N @b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J VhJ u<%Я >ޫGԔj/7 朖uvq;_=A2$;Qp p/F~%ԒK \za,\Oy\2mݏ;xClL,=o{u FNƣGKѳrtXoE@7 ?)5QaZ&ݏ_mOnK0bXp4Ճ~M9Aw]zZ}<5{^P"2!mmn)N1DAՅE>ΕAVUɲ \DMIYGWR'5jP:rj#/i76IҔBWք Zj\]pl~Ep.C+B# "e0UGh_yJǓHWΒ^ `cʙ"w?6 M_2]]ykϰXtEpe1 C+BiΑ7D,u_YC(9Ut2|ֹGQB@"ç+ByCW]O] NLW 'B+]r tRxp%Ҏ8fQNާ.9ڴ˵ˀ,oT6tz^ObiZ/i{}W}V7׳iO^/Fs=22ao?bn|,MWVĪM:E~Óz9zͽ|^~,mc'*uS:*֩訐K-\JFe|ȮcVMUeHUTmvO8.Ĩ Jh"IWz燞ȝc" $l)tEh:]JΐwA uB (:]JΐLΖ]W5"~ >t:CWk*֩PziΐYi +]1tF]ZNW2]#]hLQftE(c:C u lpOj1'#P2A7?vZ*oz}vSZ8|~1ó ~w2hˇl|ZWZ>T1}@AH/ h4E^m {ַMFs"`sMNO;bQ^"9< K]Zd*ԡ}yioo/6G]L37W=QlTm RXw˒GW_ѨUJZpiQ_1&fx͚9zv=fo;jiv-a;1>)(The1-5#r0$áB57c!vxϱ݇0:֞Hn Jrpq˒rfe/m PΐB4+RCWWRj;t"0]!]Y%;틡+kC)tEh:]<|th0_]=u vLWgHW^;%Au"! r|*h%+Xr+{}VCρ}:]虮ΐRDW.*DWVˡ4wCW]oB0:p8e9^ڛҠ^~/aiOY;gcJ815j?PZf4NI ,gP ]ZNW2<9ҕs,<]?q5ż#. e`:G +vCW]ZoNW283+礶%%lD,*f3t"N0]!]!Tѱ$`cE1tEp]1tEhځ tut\PdG_ ]"fsW%]EDW \++BPj]v o>w>F'zyd0z`sWrLWz)_ D&JMH1J'  ιA؛eC<t!z[t迯(^'jңzJ4\Z2nWuj_n}~5VO+jBW2]#bcDE辷_Jڹ/wh2q~}~;}]goKpswߑ}"{Ca};Gl9h?Q\r]=Φo_t;tڏ_nl ?v}]1ʹZL./uH)N>]$=X-`T}][+Yt >a4 arzz ;J墽^0(¿,_ʺpMmP{cJGؚ݄uh٥,cRɼ f'CFAn_5l]6fkx bsixrL޻{fG(]c^fߟwo`WtzrWޕL .-V-w|%_Wiz7zŞO_}lnyNgW䏛B?4P}+S:]:_.kRI.!8Lz+)$9b<|qjgKby 7!.ZlM>vڥi}HԦ 6/%A!;mi Vwx ":$mf1`}g>UYeC-u1(Ljеs JN/24O[rMH^61km7 *<ۉAwZ62:նqR盫G0G'[ʶ)ZI *r+-U!Z'D:䟧yi  _准{lUv7TOS('[]^mb2=_wr^&E[^.3F_/_ݯN%`\,c>a M\c&e=OWyDI[$5F"gR?9jnzѵgZ=Z>'kyqw7(̢n/y<-ۓتp5ɑt׋ħo_wB?-vnI=Սbn66Zix(Yt:\ U{]=^w=Kbt9lvlX}rzg/f4:-6^ է痍z}KuQ:ћ^_{ϟ~w??_x2O /v, r,jz~i>V_O̎z HF)|لyw6jx\WuS+,6]^6KRIF#ޅx[]n|/$i}t2f"lGĽb$x~1f b|"Do8n|"a Np-uM@!kblXߏ)r5d]{}u r/ڴ-Ýz=t9 5Z Ʉ t6#rv PrI#TP"'5&k:3_=P}{.6oN\3= l;+7&{M!+; ^m~ԃ؞S_'[r.9H8?ay[m#` I+W^ivs?#/ x6t*nυ +R8,ëpl hetC8`T0'#<qwԼÃd6x(ѥ !(S !J Q$E .EVXHMc,JAdᔓF II$ALkk$z %4QswNHYMaj~dž|/ ƨQbDJ(/b$ IZi5ptrt d0ɒ)Yk+R J\gɗ! t.GS9xoA73ID|K5d҅/HKJ%SAp1 5<خuۘD{PQXFJ0rV& OTs% 6vD6vN}b Z/ZlO Oz!~qenr|Aޏ02%e!֭9w#nʲ)?L=Α_nM;3G՞ɤ=;\S4y`q32QfaiDG h9+%+s@OkOge4炯9ƩP goOKvl$@YGm0em}VƩf /~hz כ2m< ^"wu^no"w#\泗,ik5)vEY܇ۂ1?ϛ?ߏg1'on{U)^}iF;-h$9*c֛/טg64.TW٩XY\O >0P/F|'FqDZR@6] ]zXBN܉A]G 7'QA{/hTsGO'4:;~9'}ݛWx7~`Wٷm٩q[ $m5(^՝4 }AVT<>/}QYm;Ĭ׻kכ+V pP`@s{{",tGEI%\}l X Cgy0;HV 8nJY#Jf" mPl苕 DB; YC4ȶPD MJU\q! 
RORZ{L_Yfܝd]dO56۝LuusN)R )EV P;n 誋#.uél-CJfYv!e^k.qgv(\0TlS"5Ezᓇ@IJcJ ,n+ ǒbP1d]1l&ΞT[]ХaalÇ͟~|H}JچK)uT(J|+AhZvi6YSǨLx??IIzxTFyϑZ2>76fm2~NiygRTz65[Pl:?KUkvGY'Ns;ZlrqW+߽D(0bHJ#xEcXZ"3Z bIITRɮr9,!IdKZ*nskL=v7*|a38ƾ Q}Zl󋊌_niq_i{r_d_c6yA_袭&ښBi%DBPtN-n"Jm:*{P)I%Ted@W8>`K"ìJ=v:Fy,^vR '9 a"))X]ɦlLNg"L*1lc]!32QV!"[Ig9 "&,E栚\a3q}P_w6+~:>L?G G</tX[X!j$4AP1^$O43((v4ho E B:-K3 4Y9ک@纇V?~q-rʹ@/oxc&X4aS|Ȕ\ u_5*c$%5iDZC8> [Hޱr69G Ev܃7茄lN?N[f'~qJZwd"h+ s3Ǝq(vDR8CpDHtDpDDpDHS P$@ 1+riDL0T( T(I0)ERˤ-YKMTĸ JK[؟ZGaDh'<ܝ~<&JS_r>γn>:]~繄?z1ހ!/DNrH|9JCKr]^,ծ18]+DcM8R2aCR6E$p &J2d[k&FF]ڶVpcֱ)x?Mw7~_VQgm޼X,}_2ĮOz ϟɩ{,ٜW5ṜWuG^2Խo=6Uմ3n5HI"O~CO.3$֤x҅/e6%: Sȉ$ (]Dik ,1krYnw3> ?)_ΔPĪĺeNY gL RLV`(9hjgQT2NfE&sB-A+,3.yk)O5fig]0^QOަ|XN:":#J}s 뽱1EYu,H(u0&KG~bVlhOVH-q= 3B`1UHnZ VO8i{Eg/bJ(B]=Ta$=늡I&9f88njgDzɸNXҮ&u{w#}iϐ*"V.c( >zhթ} ǠէeDXs*_T4m"ol|+j>u~L2CR4ʊ#eeR"Ű3G;X}LC:·Չ5|Kҭ*x}݊rN J$*eEH&OD6FJٜlz)8o<|E'uK3ykgҀR]䔺"Ivݱ ~02`ghM#VH7[zK4ž"ZC)9 R J }1L9-B@+<*:oA?8$Ͱo^wmZ_mf$mZvvKoi5N F!YGƞ :[Ԓrv?o`6^lv-:jt4zmwTکTORI1ݛJuS>/'iPBg `hn)1H-h;U1TͭA5k>1j$ Fag#ƁK84r걵#$JaH8DN,Ûy&gs(uCÅ)D;@<.X0( ̦BG,D,(`vBw[ioa\M>7p!^4ι^tno;U-+ԚP &mh1*$]Dj&k!˚^'Pbҿ 2'.lbR9c6([8SviZXSL5ϒMs3v GKnŊ(EAB6ƒE ,ӝKjIzI`Ű":"anq9.(D#,D!LH"Kll6iہr6ȹMF Ȣ`;X+\$J />5ŧ,tLOkcVO-lRa|hеULOb25i(QʜIs6p.RB KZwlY$";7"b,Θ](Q2a@Vk,SQYۏc$;n>O.+{r|nE`݃c3" 8E3r CZ*DGǐ@Ӝ`Pnv;y*&qCsglrD`$;A PQsUzxR)j]k׋xynmҨbWtYŲ\\_QPM8sN ;CYx#PQ':~@ C"(`^{o AHReZ30{-#TDP ܊a18;Z%vj&h,ɍ)|^ܾPڎtk^yC^E3)4{G,{xV1\Vh}e|l"=5 b6o@ROECswdJswﭭYf)sHWҺ{ۚFwg^D!Z򐌿-n_YyKcTOȸ]>ZoeIw͹vu`搶hLEgv_k[:lsǍ6W>c' XFj68Ly+dm(igTK].q`9~RI1VͯLq)Ơw`cTDN7:SV÷Ǟv9)Y0G*Z:.mnD3i/[Pn\(Kv$:EQE$%2؆ȹ RY|[gY`. #, 1$bP6r`2 {0< aS$wXNZK.b#Ggp(b/ 7/.&wX .RvVr{=}3\YpH*2`X-%Iz i9 =R8I1#| 4C>͊I Bp>gߠYtp6%M觥kиLG)~0"Lix1MhRџ fߣ NW-Xq>ktl\A.`gE Y@]-4M`("`҅ W9犋x_+:+jZˢo',K܇N+Խ+U q+Eܿ[~)|s >^A87#?\8a@WN\" (.&ᢸ4MeǕw)CURkȏ;[zCF e&o)_&W+X3<+췪[ -q+q+;/6\{Wt־/ߝe&3}J;݂xv.$V'mHPwG?bVҙG2Y/wnj(b`mw)<|Y>YtZگK M:P=A\p"V:׾/ϿO+r IOr$u]Da][^MzKaZO·NH{?U]؊9|Wύ 7@d9cBQIB6U9CcAkNa<(Z.BoqY/F&?W|4S '~цc[= q|ec;eU,fbtڋ寿|8k\jϮ(洐+;UIlsQQ;3#Ym;N]Ma=t)[c`%@U=C.PnhpzleObRMkCyǃ,-;;V&ne^oxv,ph!A3_qt iZ@RK焢w[,O˅BېIp/x051- {\1)w_W郓Uєfv)ҞOᢘ|~`E\0KAO2~V2PZ(0qQ"TJs7Pt9) UT_x'h'*,h)6BWhlN<,y,{n}S%LP%7TliV_.lce8?,;j6|2. cXOͨ,eZ0ZL^Ҙ}.^d:_'_\FI s$[CW .m=]%RttJEti*]Zz0^ ]I-+q{=UBˎ^J(ҕFs"U[CW -NW %]=;v!$󟜮?1x?=-]-SOCWBIБiWt%;zhcYXmgoJr|tv턖N'a"~) yM"LƫoxIy)Q= "t+e[h:i@,:~4͔ |6nϦfmiDr>=hK#3cV`>6~`Ft6pUw܇285q'&+3r!\0s,Mxzp^u'jb0T<M?\H*~vSUL#e0}ۏ߿>V0#)6gF\8jw1ՒI,eA0E[Ƣc\g8{Ȫ׉^DvӬW{ɱ6$Qs-9˝R 3QIKt窗y LqY̙C:jtNcʠQsX,nE,<Z[4 em1h^ECEt]%ZNJѭDb B[DW nvp%jzqBK^J(k3]S[DWx!$LZq^ B]@DC`ZCWW tPNzt%)MKZAJ;ztPMtuUBX@Rю^"]I.jmPX\@!H<ݒTW_n Eȹ,2 4w'7CyAV b’ar)ΙP$W aTrx|-oiKkoB,T.ELo\̇ m*BJiiHTRioe'Ux^sd gr>Qɰcj@EvoDxB35 0 Xyvu]iM jmfQ )רO3Ĉl)M rۢ%5uNGSS;v!(CONWSrtu?":e{uR]=1uc:\v2h>@yP=ָͤE,Bp^i\Ҁ#z,PbݱKdiU9ke3SYQ ҝ{y]:hun $6͟ B?\2Zϟ σ̟3Hrl ]%5~B+wmH}ťmɢ`.{ n&~`XEr$g~n-9ȭXJ48N]lV=Y$pGu@pEs"="iŮw|<\)+%CbW$';".C"Oy)1Jk-= "qQ \UR#\peEqHl',>v_%e{?^W? 
\1dp"SuURǮ~Dr:QzO<?/ߛ~kQ*gWUNN!&@oSim*XML ɒY|^3Z&> )صD՟kOd鮝HE:ө_yyleXm5a23tDD.`mI(qbVFc,삗6|l\8n'(ip Xib: NYEFQ.yzMv`5&Izˈ #X"^@*RkX_{^#C' uH 旦 ;KBkgXwzшͅC6{M>767{;,rᕛvo_$M[itt*)ULᚗOS%ǼٻKU_pkWyvwE):>6̜G0f0 ҳPRp¸13I1b -3],C' X$dOD- Dy͜UL:21KṈRhɤȉ#/a"!-u"=0cJI|c;L O߁Պ:~{R񷰥MLٻh;Z\Gm<1Yף tiWߋρz~b3)v҉MBqk* '[# *zsZR0co3q>^Fkpn| ĸJQ(s}dXEjc$bBD< zѝyd }4d[[A/ű7<-U}K5ed*7ӱ71QS:euon4]3&SDI2:d)`0-[\ 1@m|vrTWC 0[ym>&̔?=uϓ&gw|ufWlaݹ7huF6mojra/G۝Czﴨr-yz6iyܥSt2K4 tYb"Ι+2"XƵq-\*ĥ CyZ1UjWw9yft_q'T^oi5ѳoowUd|]qwەݏn0[]׬Tsa&V}3?_P}rU/`_)Gߚ%uZ)H-{laKlQ5wHzvwTkfzq4=;TCHԄO0YEQid.ʪh+MN@!5;]PC}/}jG9Ժ%,B@E6DZ8Q#8c9r"ٶtS*Ņi],+M6f:C*CLG1;QLBjfV㭆ѷ*Ky&Mni7BX|/Tit|-Sގ!.Jj0/Ǒb>Ŕ%s8ra=nX`eLbUV1c%4+˔tA P uuf\rXVt:9ѩĚ2JgaS `". XbL; F8fJer&%VY Cܦ=pcpޗ]ƚ%QaYuIMuv&ì&FBйao5-7;K q@!8A!LHo%Tk2dNH*4hMN@/՞Sʲ Ie7 xHHJ"RU0 jglRM\UKgzUhS]>&:()zGI'gkOpurM%O}wewfreQy?MrYoDNc$& Yx!Zvu,ݦ鲂&fYIЈ1K.zRlD0l9j\# ]#cg+m\8PL̐쑧 !-8FcV|¸t&xX4 b*/ؙ~lcDGD<"Z}TyLdGLS)"Z$* PlHn6\F"L N0 PPt0"~'/21ne$@PR=:c% Op CΜ:0$w{{G30r].&߮)_|snpFlz~u_g%<8ۅ1E"*2 OYhYpBxm"g*+ "ez1Ԉ,ue)ȣ` [2R(2$)"s㹂D ]Le]iIu,q.RtٝW'mn#7%٪ *ڛl^7ˇK!1H.t,px-lK3t7}'-Rȳ C)8^x\(a+c'Juv9%QD%+ULFBR4HH gߖ0~"T zCoo4{3e!G=?lēRyxc\ :^L0d6ⅇMMW?3uz q7/_p*MBVXSb`B lbw>{&KbMncq,wY&ETBb /M9<V*z >[i{yEfZ+w^?l;cc4!{[Fuӌae.0)`([Pp"`Ō(b_xYAU ^Nze>}+fFL'w+ Rq9K^$ aAjn7+2Υ :$I y.HHZcLH{V-Ukl΅qE9nϼ!6>IL(zC}Hd&3V~5.kJrJ2.\8eIV ۵u˜Oj2܂&l)RȤu!JF8cNkOĺ`Z=%,{PhЄQ-6.FIܤrJoߔԠPK\[Oݻnv ȍb S!7,5Uc[Wϭyr-.ld}y5e-[wvyx{5=܌YkovQg5]p5H<ηt\.BpqmWx܎/nmJjpE?IJst'ͅL?Vj8I Vc]M0Qx\Nuwb ;.j_i ڷL@A$G.9 ,@UY %6*omw;P:$bA,mN3w^,߭}u? ݑ[DeEKpÌhM ٷE ^@4:qzˎ1tI2 P^3Pd2.r T9 B` ֜dA ^8~sl[DI>` nb=lx<_D”6IdѠ%AC<bEHzHhI ?PxFyXO#8s<#؟KPP61Tpy긎1*vJSidq݅Ak9x:8~b&8B!RXgupsTP&TﭴȠ֨~eXX 4цm@I(j[.\|v%y & nt |Q=s68,<$" 9 >2*XHs"ǗRH'wU2ƛ2:>>]InKy+;8YyәMK9ϋ?F-~43Prz_B>Y ͎G4@Na:ϝ_#=3-ZnxV]uu1^;?7K?;hw]eзh:CkHz}ܙ_݇(o x:{}Y<|R=Y,k C#u?&jD:ُIQc{8<헣A[+ޛYYTAI5 f;VEd:ZC2 d+{kGokG_QlV eY0FD#CH2 pcQ"g+)I S.Rd$/a4cg}{iv{<yOY!UjKS$1\rMF/_{ skF I?x7%_!6ZʯwPJnβ˝ *.\}LL+CIHI;+ @*`QRKF#[hb| A2vl/.Pޢ&LІitI^$c hJ9PpHڮpw q`2{ĺ 3㸶q[K`s+JgRJddҌĉ ۡӠy9 ?Ѵ !iN;£!a g6q0N~^mǹN=B5*A>=?xߛW3aSqTA0J('hLδp."8-w"bU SXD\A+p%[?Vi N3˵y䐰{W迕u\Ʈt n_ŅNNGךi Q9Qة?xS#fdd9Iv+x|t%F1̩nݛ7{MTTx5F^+ _s>i?;-]3Yfó\ _"5q$!oi<ԍì#;$Qşp*S|4\.\ٿO7L8*#G]LiԦ2" ln%_F^fFs{٩:Uw_B|'dG?~ӏwNtw?>}+EBnB5U Nj5Ʌls IʟK: <h PxucpdiɘJ䔈JKXxP!'"ю&0D־*;nVۍA9́iYNbm &x{+2I;0WUh?xQܶ| >ɕf hr0-;kmȲ0ۺUC@`|4,i %^gT%Q-v$>UuTWW_3> OxZ.{> =^#/FdV gAc#81Ζ8Y$xX FB/]Ƒޗ2tyǓy]\\t[^.{+=7Cavu2ߏWf/?Y n6>s}syv `ß~-Y܎2k 8 xi_VT⟏z$pՋQ.uF}7<{U)ц!qS<"w*?Zy~!BӮzh8E8ԨNu$rqkr.~ d3E yDn6JHYJ`Ѫkڎgrw!Vyyr"sr]k'z޾Ж ԅ`[f켵;ݮc3D^?6o-6!{ ^WO^w^wh^ \p7O:\BWm}+4tC+”.^f7p:Zꪣ\}+חO'DW7-O+m]NW@iQ]"]yRFd'CW ]u_<:Jtut+6v:SWW@)HW2aCo+E~j;>-zyV',t%Jcz8]-=c4Vk/W}nhrk 7E3%'bpٿ+7˟x9<^ZЛs ·Ѐ ҉WZ%çnVdh4)B{(4}4́ |8쳫xVŒӤ:jnı~vۛ9Bz}yA]g;EY |8,"4edu1c^ή~=_5n;wp xy?|B{rL4zʠ!q\t'gpK^?;JwX?A ѕA ]uBW3(%):'DWp$2 UGkܾUG)IC+1"fJf񓡫N v!;][[yá+kh7!t̠K%OGfʱ1'DW\V1R‘<[Ä Uwm)f:D Li! 
;py2ꪣ{?Qzw ]^)NWf%ϭyvhk}+D{fi #]=vǻK^^">w"+c=GIJeżxRFB,_m/ $Yj_?a/x_[4O#Jv<Z'j ^" ?Z!I CӢSf\OQ~(?0#W#ke1ibidT=ŷGPPN ZF!8!{Ԡ9*8l8Lu^Mɑx;R!:9mɈ]03M:\+S6}+\IHWCW ѕM:\3he;Jgtut֋MG]ua2fhUGȋim^f:sWsoZftǹC+Ī)]uLב ]utQ8ЕsI]7ӡW&Ƴu{e ǹC+Eӡ+fΒW&c;Z:p4HW!)`de7hOw{p#]=^RA=:, Vh硫P=[Е>cP׶Uܳ`ԌI&+h6{>~='c J9l5%ӑ lhwO9ʆÑ Fo1%;`<pBW NWJÔJ0r+轧+ܷ'8^3(^V;3[ryo1h[Dƹx,2(o'q^"MnP:#k[BGΦc2^q>?:yЈ|qWmm<^A.|>Dt}~˂WG:hV,bx7+^|zۆ~_>1\_Rz_J>O}npQ;{LZlњY|Pl^}|I_Okg՛ =#PQA^H>c>ޏ yBx\/3wR3@1Vӷh:GA=-&/n|"xUR}w/wJ _ϫ7$5s4Fo7% Ȧ֨T)KJ> -]?z I=kO^קW(.ky;o:NJ} -RV*P%p-AkMSbtqXN6 E%IVLE$^GߗN11jT b6FŢs5bJEbt387?ؑ$褾ُ r܌ͥM s(,VM%rkhsdtވ'=]uZ+ jS,&K75QDqM* %%qJR-$B c +Hf cI}cR֭Eę>bTgkOpOD{}q) M&$Bm iθ˚M :Rj*0B?@p+Y%&ּk94YJV~|m=dUZlƕI!sc4`䒳"ecVAvڳi,xsD u;a*Vѡ^& E2U0zaLKjSBV4TTPtT-Y<6ԠA0P9"PyUɖޕ8ɠ-] %"[M`s8J֝AW\ZF6TaCYuq[Aj+xT}FJnc֢RR}CcP}JC.((DآyX|/H1xK V/I1TFbbXfC PQF6HK +ũx(R3 JS㷜2Hᱤ:lKC@z[Eoj(PB]!wl¨͐jPo]TP?A6lA0KBo@ -ɁHCceU:}n:t#X,x4MC10gW&sesN u&X fP` ɄFB8nVj \|MEw&Q %t4GQFF,+RX6[GRAPSN0NݕM()D5ڽSQA}wn  R)~V+CB9qJQ8PFl,0ڧ vPuо:o3\+꼵 A7 Lo{ _RYdc̡:C:/HLE< ت37WrӚC8 \flzdyi>I|txՂKy@9r,}V2z89Hmi>Z.Uc-EMp❂N k!()dFY VLta3Xv@0/> ݗtdP&G͐ B@8 Wvd!*T?V<{]by'vm,93|Y͌ QH Vd߿<.ΣN檢82sOOBFQ"GA])6q;90(T}}|pGL_IS`ʠvКvZ%$AbedrTkǂ@A WRgɮ$M`6dQ" `Rɚ% <^a:IȒ,B\9v6ELL*>Ql!rP+ b{c؊.ph,;a'j**DŽpJĪ)NP gZs,?@gHwڳ5MT&PY]xSXP)_z+Rpm iovBeH@>og3P\L CRj^ZM:3+9@ӡ 0^:=?+εݙD) .] ݤ6@Ger a#k0vPQ$(uZ*5fݯFHa#d%›A<77Tfu( 9RA5v# CkwŹD;ӤDbZ{㸕_i nZ1@pw7#Cf[s=Q![ ߷)i$kPn!Y<"4C0e `  :P+5>N({oD P ?zVWL":}pqr%so `|Ro- #=Э a PaQH4Qk``g=kU'X` h\56VЏ%'+1V'F#b~O w 5X{+{&8>&@$jҸe,e0w 5u#j7]k jJBu`@"m@h2p(kfphYB Z&@\ v C ^cO&k=HKLCM\CO*t+7{ B`vNlC%pUըvy l*X1 4+ģ #\I"! W , \c#:at ᝪ70&`ઁ#" _ :T675y#KmR*ר<ӺD6Gu5_oLJLUVF'Hn?~d(`n>Q]y]1q6W,c 0}GwżJK2Q4ק>׉X̲^r4BIwvƅ=e{3,V8Ml%MҸ\o'W|6?jS7wMP:M0CۺEø- M̕B'xO_pK棳F)& u^ (&%*7ӯ- %)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ RUA`FJ F p5E h?W!JOgƾ"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ Rә@|@l@ +:F%Eo@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJUnNNJ kT((0Hv^ dg@Ǩ`FJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%(>Zoo!b\H;TOyE͆ӧ@4VKlKW\KVٮ $\:ҭ}ˬtȆgt(!:B2+,j\ *uB]!]lVt01BBWt<[D-HW[tFt͆5BwDW_ ]GvZ~T tRJmμݡb?ޮgf ~g.ȇKWo\#r#\m1y׿YNˁZ t?sp_ $oX)J)}: zV8E=}ˆ\n7lTҕ/yJT6gzU'j z5RIVjUFD Ӏg!46+ɳhVD4ڮG4kh03+l3ZA _AJ]!])i[K+kY/P%]]i)i{ +Bw> QJB8F2J2ȻBBW֩te!)#BdCWe ZBVw(JZ&Έ +D+l QvqDW_$X ]!\%s+Dk:OW]#]iki +鲡+kx.th:]J-ҕq,+lD>KWe "Z#NWz#+*#B.v c(Bwc+6#p53Od@c+T9X|ƻw޻B#^2;<]XCֹ\sW hj'wla]iv=YIuIɌL41~*SKqqR;|9c)%:}LG{GJzAxds>N=k72g9cN/N&oErM\d!\sqJӵ0Q$E"9uFgCWWH ]!Z˺NW;#+Ŵ0.#RCWf]ΨZ+j[/p>Ӿ z ]Z!:(HWsTFt4\r+DM^4HW+|Ftʇ6=Q軡4uti;ó+Yjw>DR]!]y) e+YBDWHW910qtq1FPH )Φ o4A3.z5 >ysR,/Q>IZ1/oM7׷>1Bqowas)lP}ZMw`/_)N;99;zb0OR* 0Ay5@[S3RAZ~Z_s;jeOږ[[~kxR4湏p\ JdPyi- v(.1h|zg(fi:6/irgslھb<|D|=oۧy$S&O Sg NwCʮeq|5CSs7T',(UJ+yT;3iE64 0siDMiQv D_Т{Gm9u0 .6v#ZtС=)f5z zpɏg?(TYEzesޜMhk?%mIR)*PV }K^o?Oc辳/g{/GukbF`W)wox1àgﻤ-/Ox2K³OEcPW^tZ ^ϺXʲ^ :N/'@]\S?3(褍f8`Aq]q\4͏sڕ jp ӴjB#=70W ۞=}wۈ~=vBr${}6$d}z^̆ܵ0^Ϲ-a}9&AŖAg6y|W<7v_*aȴ}0z~2 [{y#nN^b0B¯iXL:׸{ͻ[{^{Rvo^ Gݿv![[v'l;SW> ʝ;IQIj/ xAM6:4zÝzp*%@.ŠR蔳W?JF+ ]:f\Re%E( g¿5RtB %fM(Cbf]!ېNѲUe93ʲfMܪFXʦռƗ=*'pr B o&zwow=Mt) RwE{rǠ>5oj@Y`bm{o{e#o܎YkinYyGMc7_!;^.&ڪTQY~ul[|OڳxE?^sJUztsNe( y6iH2L<4$Z= IR> WII7/veJ>>V\Ցb:0,:mU0ZTuaGViRxoXZ Q 䎳h ||(sڃ @P/KN}Kd0ޤ2κHM&0-3T-k{pCYT v7q:om72L? 
+]'G\z,F^`aFD&zHRǢ/@ K=Gxqy=c0d((|.Pq$pmd G{u50I/Q{\\MpV|qמD”6IdѠ%AC<bEHzHhI _Ptx&yYO+9ϗydGB?yD,#lb@{#qcT2R픦{ gגl qĜ&8B!RXgupsTP&Qa%{+<2詾3_{o!֫E +t!8dahP%u nH^ꂉg^v96,dp68,<$" 9@ >2*XHs"ǗRH'ىR>wݖVvNM p r>ˏ^Li~㷀_crD>Z nǓq '0Lk_H5 EՀߪUWx@]L u?ó-oK>*a+vLv2'alvBoλ :})c~W٧MߔWe^~[8e0G~0Ly[6DAX*;"8W ּOei l^{>{}.=M`1r{6C|6tk0d 63%}u iD Ϊc#@I|=:e](:+`<>5xtq$C ,E/2KK*$9p"t2tV0_LZUϽ@ 7\ /6CnD}]Lqc@OƒU CmOqMQʉ2nړtO.C3E:y1Z:F^ m)8-T%DBFtoE[ϷnKvwݎv}k]m4=ʻm/i=eWnwUj8g + 7}Y!Fcyށt"`Ō(b_xY g^EX.k.FPÀ; Ԝ%/`RTkamℰ 5uVws"IRz4N)J")!d`$PO1&$ƿL(ѵ3pvϮs/ uI[K8xYwWJє^y"Z*HO 6b>@B$s^Vta \!P U_8v >afM'a<6th BH&p)(Asө(hc6RtS>RlJ3.> 'drDSǢ`gK_@Ŵ6sSfyH́26?ǖۼ-=Og\:S6^^}ySn\^m-%uZ{'FCxab!'$KD{]h}8s"{PP-jKS$3슛+wpۈdʐ CG |9YB[ᩦy^O\Vn=tYx謆+Wqrh11 %!%HL4$R1JGM#H- WD:9{pc꿾~S뮦Lm\rwϋKKn&(&JՑ[o_areYx䋖Y@3yWl$HT#|"V,İ\vxI(WeB¶fǺUit$w+בb(f[HԛQNED焦 oR%Ɔ  C8 -X*DTzi-%V!RlPnj#'I$<w:TpNvtV)7;PnعsXջ<ӌqYY-;r䉆P+<ѩ%0^,NJ"t #OgN!-7[<-<уE-$x;g x3. fq$v9H J\ZrrmE"@NOPE#DF;:gרWh.M@FX蜈 rHxE,q<X띍6No:AzJ9J (wDz2!DȞ7:u.!D9͡7"^7U'BIy 9I\ߜK: h PxvcpdiɘJ䔈JKXxP!'"ўfP_O@k_qޞƿ[vcPo|"s`+s4iƐm әeK7 /_m{?\np+-\LfXɛagd,bװGߟ^.}F&Hdp,2WtVWYY~_~e/NjKo5|P/zpoĶ(t1o};Uoj[mcm7,z]'G/A=ɲl8n&h>T:V|15= ۠.²Q ^?FWK|Ok%飗괿Rh4 :UYyflz#r.K^cIMGYx볡gP(2R1f*G \P*.Y:J:P!ZRwZ-8 2 _oh87}fen/g_ |y _T٠v1̞ieItL/H`G-2x ,y|U,5`Qh Zz-6LڎB$!ez~Zp359֮fCamY{`Lg@%@ƔK Yqk ^ieTD>+m\%)d K >˓-)pLs>fY Qs|؋eBCajq(#ʌhGFIg҇2i$Z}TyLTZB@ @d%*WG EmֆC4R3>ɤ@JZHpͫΈ%jO85jVr /r^9yqŧ,^G.0euk$c6$oM SLr&!$#//YPƇ/@a ]`nӇ~ܩRn2O蓴wHHُWZ2'u=QMSgFk)C;^Q!aTA8e!dfD<rxf! !̕WCBGc s+ !! LLnd-F,S:"cA OpR$J#RlȺ!o o_/Ji|)i`zwZ2qLQe4$P)1'&rR1.PsN̍,GcYp,(X9Q+,ÀPH "s㹒)2pjgaU8ܙwŔbS~/m9;En5kuO+6QIn9ɽ,'҇i'SA ;&E`&X'(R}*F=T}:vVl㴏,p>4y 2~ټhBrsv] 7 i܅-=\Ys*Hbkq=H%&dBk /T* 'y,!miVÎgybjQ#λpO9.ah"e6se,:v딇DqcQGE+yf,::.EDqOaDc]`CYKxCh.WP 3HS)-u.7'1IZ΍WK pIg١Z%WTJL_.{cYTw,Qr@5(*7 Iq6HȜGi͒Bdw^3gS1NZFU?s,;08 9qa"Ka&C4DVHtƔ j>ggkxx-=aP[|.%AmSR"Wy'ZM4'ert.[QG-0a'xP,ev0TB #kG 3/V+i*jU 娓F SbfY L .#6F r#V$LLpq$cP}7gxs2M#Tւ "'R*YC6}ABFe|N!'f(hE?z[V:ԗVf":Vve.3!1@)r9+@3^ƞ}n4.M'2~9!>N:L6wN뛆~>6?OO>O7^?7g mq% 7<'ySN^VnY>}^*>\Z487 ۃ}=x~ r+JfzR҃}ZP"{`V)4NgGY$Nf&Ɓ(:mtYm)fU{=6\}zuƞP:tCX&Ul@1kc)1iyLc!w*oȜu{(rKeƖ>+:ˬTA=ewl}S MbC8Ѫ%efAQW>?݌^-\(|8XI. 
_`E\B3|芸D6*Tc 0 Ulۡ+]d+tUЂ:]ʌtute5DWX h \LtEhPCrmgӡ+, \]`U+y+tUКUiF:IBזn@#\}p5-F 8\Ez?۫~%Xٓ{9Go&yzW2|nyV>wɺ/>CRPK4-7IqP\L2i(M&-o|r7'~xȷG{)|[x]|l]T8MtHtr~IC>^ڻK]\ob#l׫2\dE)nLJV`ɟ3&'3EzX1C\̓=l|R]۟Jkܶլ1ۋۂsej;]~hNnjpp >nX,|O[/o{>xC*ŘlF´ Z5ݑG~"z~> ~ئ ?N+CSy;v ٥KȼqU.dkwŌPfKYk.u O6Ҳr 5qI>g8D]0n&mo^pVSl{y$YdS T4yޥ[T>Pr22]플"ܵM{懫2v4ZU_Er(h0_Ehv(՜D_>O ew}[ݚ|WaN_.aWWo+ݬԿޮKUFSeo'&%Awy2c!K'Wy-v/>9L}%& ^^΄46YMsWt8HhEX]nt Y^Q?:_-B>t ,<{f[zbz"i~t9u%34<]"^ɴyK{46X%W'e(U_34 %a/ds߰gIS?d!V.=+sZo~􁟎k[2~}$]@z9<^."V %c.p/""]XX.7i/̎Ԝ1e<~^8vmnsFGR{9;e>ʊ1 (n\YN+0>Z04 )96 W=,%W+co?F%wJNKZ]RV_6CWV誠E:]rRgӣ [Z''*pU3UAk骠D3 ҕ4\+]`*pn ZC+BtutRtE[B(p]-j?Z JcG:AҨ&< ]Z+BlK * DW.V"3CRȑNUVUAƔS+k_-K=" l3tUi Z5x*(8xƬxŽ{6bp?ǥ"?R~(a`i]HW=/J긠?Ϲ8ӨWz-->K%ÇRB ̺舲2NWOxhYNvQj{q:x'5=`&X'(s`L P@Qfkk%n(h=n3cpq09 PUAqtUPZ1 ҕdZh>z~pl Z;UBiJq4DW ] PCrh{^#] ]i)1H53U˱*h%:]Iҕj ]Z5x*(VF:Bn)+R{+(*h% J5]"]Y!P$  ]誠׮;=w)KLzd#~(&teG:t9Gl9|mN*&$czc-,!.l \[X {l铤ii%ʹ* Q~FYvǣAsi5uhQƵmq5lk _ȶOjG6h `d( h Z1RQќ[+Z)pZNW%tut%!+*p#UA 0 ҕ 4kl \LtU~(JKR-%M`ۡW4CW J=FWHWFLCtUBZtEpkF  NW#] ]bVACtJ;;.43] oG+qM9&@ b/0iL2EU)(jbDf{'tQJ+#12h\|\iN2f#Jm߆8s; #O>'u ZuQftU}@0mgkwHVy/h6?/h9 %W`NH7dRnp<݀h5!M %5nxJ%) Np5xBG۴tt):!ʀDeD{|RQHWx{a%''CWx G_P|8]!J'HW:5/'CW.BWcNWF!~FȭTJw{ZmWԮWڥu_pn چ] 5dzO5a;LVpO-<)<9̔8OrOœCB4ޓCo~k=1MB862o:<2Z-NWqY-]} +9h%42JՓ+ 򣧂S|ѹy<'V_uʿFcŦ?Tty eSw|'}7lHo*)lNjGe3{1(N~OYS!N\J ,H_Q}pFخ-*ȑ*ٲط[23~ye>/VC0P̄I!o]Y.iEյˋy-+h>~+ W|jǛHT(ElR+8Μ6;v(C(B ʐ  fG"?Un68Y^4uFyNu_F(_z1W0\ ٪##jZEL6AUU=[NzޞDzU㤳*ݢЛƜז< .$OIYHI4"ZK Q䷅Hq9 [ Χ(~x4y 1EP<$ϕ{!ϓ6\x!H$ZQ2G:X8Kx8l`Sm2r2m 3Xm4%:Ȥ䝔:ie*1xRnZ?ڿtǭĩ!yIhL1 D%#I@w Z=N#[a4xꉷ:&'D@t$cFc^~IG2k,G^ `44X YDVACgg[^X-:ofSlpl<%笇LzoM{{C 9N#T\Mם3; $?NmoG  E7?q0,r3TYq.$V9q3%|X7P^@v:nxٝΐϻqhŽZB*yF:<(wPm)K췁H}-(ʰ!ݪ[؋saK)?Q>,fKE~ "R͆ӮӧeWKVMOs'pkUԻVT|C QVb>;[\x)̮a`f+0`gU2ۍHg42$>oչ8z&Ux&.vt*g5mqO؃,> /=]98nhY+k]6r_J˵%MOrI468u2mHtr+ByPo(TM*l0nw /HG_Wӛ߿x=y/p ,z{!CACZ{UU3RՋ^Q[W{}H>3;)@2bL/__ \w;vֳ&ză,2]B',q64E%ֽCI!+OV<_KlekY2VZ)݌_MFh[M#GPf(M"H'm瀛L(< K+GKBXdg_W'vQw ]BVOm Yz5@; %eؽvpöִuXsD i@ qX!Q7XMU<%zUE)*hr+o >ְL{ZpbVc E)8?_ 9O{;-^HCdPcw F?B*5 1\jx܅٠YBV=l$Y(uN-QJQR똴gPRs$$KM Z{-v@:DG#q`ѯ疋Y~ߣ?FG/=m<{-n]gb_Si_`%e5%X~^Iy-I[y*Qnc큙[kk2 qX1wevܝgVI <0Vȼ9S8A7DVpBaD0I@h6*F%D.KIN  :hQD. >SME^ !?>}>!oRD^G+1$1.1Cݫ*lT~4Xx$QD!T+R0 \!r!iUcNlIhNg :]5BuS^ZP5YX`ԣ9 RZ>CDWԂ2"&YT1HTvㄹsHhB1*R&g F8513Fn ] {EՏfr> gR.%U[2gV@h* I%Ø|vҦޭxJ>*$HBǓFp(=5'S y?$blm)gf':e;3[ph6,V۳ 69+ScR"n))Ҵw,4o dRJk) |X: 'ۏMa㾌hjfD2bˈ7Zxږ(9UůȪ,zeЕKAcƨ>eJgb"SShp+P$-rv^-q{H:_73:jZ_伮__ܤ&qioͥP@`PNv01\&T2g#"reOVӎS|?xDi):S??T4iIp]㑲Ԫf?>*#s}ccOVTHd.iA4Y&hMMA)&D$Br≐! %BH4!f3Q ;.3G$2QYQed1c\ۄPY %L䡤%YaEu`JLԈ' 0GhȺ8;&Bbf8|zJYϠJKZCF=7uezX̹*~y8a,95:ƲBwyAhlLg8aBPsF̭*l8exI }X)JF,rV@ <@jl7QqZX ̽?hQ67]Pa2 \w[8P{g3_S~'[!RVH2N4.R*.R=L*ء/BUe{dJR(l>r< EU aԸJ;ӥr5]kU׍bkq.FL6c@&TQhI'2%C+ &Ezޓ:>!lSM aH.u *{q6H\̹t *J~ꝂbNAoS{S}Dž.'kJͽ>S[ϸ598BDa!{+yX4q<3r^^Jȫ>3'iˮ^.p%#p\D(}HQL1Da>˘Dl}Rx$ !jYh8kmɥ\Gt3F_n*ϹZE\.&(oYY2f}FjqKKeˁgY%3IೲL22!K T*()B,6N+n:v&KIH4r"&9y Ѱ5eLTJڪvX ) dRdNYΪM+06)*2 ܷW!2g|YSI 1l1&ȢҒ{L |z-?s;#.bAZ9TLh-sZ(aeD0ʟSh;Uj䶋8iaM:]f=a4"l|]x~Ap@ƵԚB0"Gˀ"i-_9(enѤ^wQ<`줸ϫiȾ+[w㲰Bm!tH3g8yފqikH|yBBw8-OM1(t{kff|8of2 )qEC=5^ar+YzS3+{~^p}8s[q!P<@s`*&' xDa. 3@? 
Eþ۠å]{B_2mwi#^5/&4|'Bw?M7y賏WGl|`6k`C="}L/b Z]&YFTl\J;Hu$HeTWOo P V|vq (Eh gousJZ^/ބWV*=3d&{mRV2Ɂ@Jnj2kx32SVh:&*L8GWD"K͜d᜗49T:_x,n闣҅ܚw0o,,㥡#72[2CNhn|gUw^Nrv5ٖ܄!/GYq Yo8eyw^`tFfbݫIbiFrャG1DtFi4ZEtqo"/NX dS vIOA nmk-A,;bmY_镀:Ƶb)o M1K=J"<'Z}Ip4J?G [Ec(PfgD6CυlH+ݩ)f ٌšcW8i$/jX2kQ RnʎY R ^GZEZ||s_WԥmM ڬv]У~YbG WR~ۼ7 7 u V:( `%Xlba |K+/ $S ɠh#"GŔP*dL̈́xzӋݞ0?Zw?@j~N=X&1gZ LkQ{2i븷A A1Q Ny-g'ȔZ0xst%g=( \9"فpBh02Qe`Q݂W`ǣ`XC}ċDV^ciP!"#ENV9TwOyiX!r4 }7u׼S?2a[&.s7pf.Ts~rsijc^2vE//pFn]l~؜0*ZM%T׈s*ZsPEJ׵~PPO .)~bހ󡱆Fd"rsw%6Ƣ=EyNF*fT⦅ a q'Oe_ԍ-_A0{PUbJ*wqN?'_1a/n?L_Y/,z_;C (/W{54p/Rxp;Ͳ*iLm~ϲ!~w4uy~Ւ2[QwseZJ_ʷ佷NoDĿGZrk5#X~\Š`)0icY! ȼ""8YN8 rߍa0ħߑ}\џ['x/et>tרqOz뽏٢B[u򗇲| _b˰Yycq|w;=)Z;'Ѫ^ho[ҿT"hm1@j̟AjxinA/fTusOf>-O5sl> "3pک6M呩)^q|`NLrЯ&>oz8rはty@iV9lA;jPƤ۹s?WYykٰ _w,OúSwjeQcw?l}}gPB{t Y&Ai1eI a[ݾViaq*FVHx~ou*2s"AdB < 9EY@X(E;W |Ռqr7|k n<8@`; =+|6Ϣ|Mn& >mx$$#߯zR˒<=lv5Y,V~_ze“U d<2 PSR q IX\l Z:\F!Ws(ZvpD%l2^ \r>Yk<6ϳ5LW\=c3P1~8`mi>#)MFJr@N\Jj9u)]JQJKܞbn3F’xA5vttUފ)W/P9ҫ(o00)upSg6voo}m7܉b _v̯woh{3\AF^X~(=db/)Ded D.,{oTsQrhBԮO CT.Bhc+! 7lK7E\N^\V\}Jf+"X s6⪐{B. ƶ+WX8]i5$̨waaKY|7d$k{x*%II]>q z|OT*n(YM?7WzN?!!r?/XFK2S3ęAQehL{m2x}EZt_пyөLʿ{ÀNN`qF^"W2}.rV©˅J}kܗ ⪩z3Sr B/ƆJQn0N%)seicHL'tRITh ,+@K*򢽪>eW9p-FxHQ;1DQ8.BR %ce,4S*@LBHN?oFFt CF`bΣo81r@}$N cO i<-E¨5 'rc9+]Iظ9!3!ixhٴvouX~ ѾVCBגHAVA %FH*+)XQ '˥SwfیҳKy)ucwK.ٚӰC;im칐AAe頳g󗛴#F[]O^&u;#jW$c3E/:O r}jskm"о =KеcӖW<~k8u:cDmХڭURʄ`*ՃI\zz?_U*e% J&'*/40ndUlUt\s[ftJ,!ld,G brRId%"xoiiXԢ4B2qPr^A@L8ϡV1{i7ѐSޙ"ƴF{L hBjisA0 ʰ9% mj-3wkj? &/U >ZKg~7DZo ڤmeՖJ5Sg3p,iPT4cFhSsw6n&LJ=`QINP@ZU09d HtƙA(,h*(£ ;tPIĕ5BEٰ9a<Υ{3gLe6Tȏ=+Ao"L-ܚu%TdQAFt &|b1d?ʽi.z!.SZ,` T:Fu"H=Af3J^dTb9b!SğN_4 ,Pčt@뤏О%O^ksAȘ z֪\"$TpcpI:&qg}rx3p{PYqϴ#98aiT}7^Kc3ʺrr$Y Mu:82l6scxzk=o ,sqm5#iZd,?,?*6몛Th##JWQ|ׁyi[k˾T`}NrP$7$NyCfAiK&LYk0FZc<(ΚfODRP3]֥ȲR$0,Ag*pEĐr MӱagX R؎.ݕ'U2O,rß)wx&ت>R}H̊3B"WgTgTUџ żџ˜-w.0O·ۣR gV$׬R٘Jo+r (EȂs7.wZmM咦iN9,wOHa;؁rw-qfJke FjG:M˒4{Yd[aM-܋Xd9G7Ree$k2IS͝ɡHhga1rZx*j>~:%{vdh\'DiVD%m oST]Qw]z} "\tgԺոi8 ڬdnѬ;&ܺ}Iƛ;mBPMw̿y'KcY OlG7 xkrf9imsV@Mpm^vy~/aCm/Xnm~܀6UԷlZ_nNR{H=-KyؼCyi L$xŔB 9Y9'ق.lm (j,H@-=k3Ciכ)(n. ci=H84{n{[O1.Tvz{_\?雛;#ʆdG 9J1́OY z[b[xѹe!1$ϭD8D(>ZHv.{ ]z&uGWk(=!}ҢQeֵ^^7uoCcoI!{WL/̷;=J>O/~R_$p_gp\.1^XEJvػ5J{ D/^gĩs4Zbf2Ӝ"Я62߼Izoo@wyO-3$*zE,1E"*P h8.E4rν6Jŀ\$.4@aw^7-E~{y߽]<=ǵO9M?eȓ??6^ݿ? K}nV4)gnLGߍ~Vʳ‰1؄{W.X_/@?08kͺ5v*qOu<*gx`Nqb,/oHOxغlւ& L$_hb*"Kmu$`8fI.VqܹO[f.Mx64ZpM"Ŵ]}MǙX7m}q cb8ORPdY&),RQۄQ(qO+ qj蠯`@vqGg܉X+.2W.YO:$-Yvt"˭kD }T:6;u|]  O^Vr6z \6HB0힮z~n!*%++6ffbun2gO4yG 'QMa5;gz7Zb1Z3-Ee<:I); ]IZϞ~B*Sl7Ru4=|&WvRPіB#?*@b QٽbZiәuU;"LcF#q,'1"@:c@&IńfhMaVL/_uPa) h۞bNYޒ%є!!iL6HK^fR$G !i 0Ro6 OǚqS7붜b),֭ c,@X -m%IR05fTpa`H79eS7R[6 ٻ8ndW@Z*IɮF~5>\K3Ȏ~35 ,[b5UO&ONR 6}p.|g΀ glL}Xi8_,kW[ %]`>`Ne"+ٍmH .鷛̓b7zc:x͙lHwg{V tu5ywj[8l^m^hs;>p~t ʻqi6iuu_3Տ݉wy7aN/i>V˾iňO rӹ'8}FmKKd?$-nkooF\,oHiNg},Xfbы6'/gAmmݭ_r[ޕԗ#iQ֑Ұ;iLdڟn ?9:ŷ/+?鿾74mj5MVާiWWvir[T}TΙ͔)oy>MN0.k݊GYh5#U45i} Mj~W8,zD'_nJA\FY9g{Kxs+&t D5涄j擸?l"z&@8 %[ 49h-g ,;,YT3 ׄi # , !hcK.-Z$gĹ nCl/玁Zjd Ir("gk]Si+YsuqIBY T__{'L5EiMN\1g*.Qh;  *OVE'4I_Ttg30S}mFr2#IN[I"GSJFpT1ښ^mL|%[ӷC>Ӵ%H0Q`b4BBِ9I-C@-٨1fCՆQEDHԤ\U"U>[Y֔{ER%IjV\ "y(B#Ek-dy$8Q)NQO=6`f5Bv?b|1~ .ٔ ҽt@0Nքl4d FdG! 
r?%ZA(k{}\2Jp8lu`>;NvrzC:i_a *8Q 1AV*l˒,REԠC0,Xs(f5q>$ ^_fbεE[2Q~k|ܿѰܴv| N>ǟq:Yyϳ:*Z!Ҙ:PՙEťR֍P6d|pŀJًANFp*j3&oHGY4\J 1XbBD=0&g=V[2d"C(%/LƙAaw;t&WFĹg^;M`Ja']RdϥvI9^$,$"8嚉MvBxm9BƺTmBeBvUXىR e %1\(a-yjA` XYQsq)` ~ Rmq!pD-pL&\:N:jyftծ6:bhG{g&L*(ry!*ʨdeAmru}1X$yȄ IFPQs6jd)l0j|vN}I?̎P,bP*[D7Z"9A2(V4:BR0+ '/  'D :ɄWA["HQI\ֆd$΄D&QfFu hW5XgQ@X]`]3aObGPN'zD=(B9ٺ$bB)r&1ޓA Fvq(=|`G0az[2],x͑4e1ձQtl{F0m؏G3z>T@.6{?bSM}r^^\lyo3uńn[>]XsM\`۝쑥]oMJqKҋN2 $+4#N:8pM :fa/fEݶc:P`K$sw&ݦK_Oto}'o.]:wf6{6C:?\yu>vYϧR͋Wߗʓ/oԔ ]{3fZ>oh46FԳy?-}W^~Zf*?Hc=3olR{fb+c-ͩ K Bepm?oxn@m7޶!; Jxe"3$OGAHJ>EHB2 c@4hMW 1x^:upx}|Q|yI| ^0P4ު5 :׹-s$>g]{Ϯ̳cSPM]s zx'ךev3"Km題gcsV[ >C6GJըPSv6PԢTFPG>|$U7#aLoDÞl@yN||0u=νCԎ:d͆_5wSo ̫bb)+Z26)³&%(g(VJ99r$t~bƨoЊy$  6@g vkv w D(V-,v '];)O*Zr,׬ !HtV?Nq P"6Agr{ɷJ(DL66~`:*b;4(X,6j6A1nNnKq?_,|+ħ|r,6?O0U4qgFo[Y+8>HųLu㒪z=OE @ruCLç8e_Nm!/şGgf0gYBob$w_(g5[k(i[&LѾ$*]u4|WÏ@Jtvg"?h}D{zYo@-9 eoFUo8J@.2YB|TD!Y)tɛVB2%c4PjMYg(R^MT_atlMƢx.ֱ]Dv+/]~QwDˏ?RMDcV` A!kLzpv+5RœDjY{nt]&3q5P }cK5c:0UWUi<!\3E&[h|e1Jt@=`F͙u"LCv(.e`0 3&M&c,c16F  mTߨGJ<@5,#hG\MV1\dQT(J&KqnoxXܽO74Ϫe8oZ@sԬF[ F%k tu}2:9&MIkNֹ[5Q1c63f Ȩ*OE -ɄDs|@5 F5C ZuC)hK -&|^YbT 9>tF 9yR 37gXҠ$E$$$5sA Pe 9P@cʰI&P}j>VqC i4Ό>P& _[G{ދyJb6\Bq|3 Nj|<^(} G,7g[x# <ӂ`X*f=f~ ;Jw FfL{P\%y:d6|1[Zm*h`f yG a:X _n`! {y[ɠR!2QӚC5 !Xq#aҥ`]͋!^b u6[Aq*q ߺ*XW+̊~T}:y'*L m;dA$ ©q`M1v_?Fn]D]e TG<)} ^{@#AzvC.۠mU@@PCwAKB*q`\FQ]Jie`Wz=kr[Sցh[Ru ByAz 9@bZ۝ 1if匔Qv'{;:O7|a $/ʀGm uf VUBi5eP?zPZGhwE< כW a4,Ƣ tȳP(!c+9QSZ(U0ݯV =i P h2i(u5Ձ{%AE;&LoVA I#Y/i6ZU5їKt`yno9ݞX==_sL2Y>D@) 26{ 4YS@Pc2jwgU bj\ ?k1i3j8%kn dL 'ӣA gidI z ؓZN9>T 0C{W9AD;DT`=@ u u@gj*3SC],A[X Th K ]:Ԇv݀u?)w$YycaU O(aYT1tGQ,FRu#xL z@U hUH?xmLmT19cR=iunlb妙?$hYXAZQkl*L|MYEZx%=ݳjyנhdxԻ5"kwU &(m@ zx:XBҬ5M,; -(8$ ߛBjSQ AOj|d$Nҳj9Ls(FlnT Nx4& kE$SjŋH 0iIJst\!JkB*?#u+yZP^){U1aՋ7B|0{qryaj P蓅-Lۂ>UŖol?gP" (PD ONO׹ih7owĿwNVG'Ҷ87;8J @B<H@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 |@.nG=N +G[ូQZ%N" $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@zN rHmr!` pl@OD @ N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'3v w&'N~{@@ h =y'ܸJ@ 8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@ hd#_vvVSnj{}\P{;>wSNa!譺{=%m2O޸CT^K㒽ۗW lxd]nmW2Wpe:d.u؉e"<ȩҵj7 gU[@nQ2u^{ޑHqlp#mC^SWr䞡3)l\qq{S[WO8J!\9S [WNE5p>cEЇ=Na6!n \}Jm \q?( \=Gqs)-Rw`]o;?\'aws>7xck-˕Qm /tkט"x=.BY=Aݚ]K|=O*migc; vd~]= 7<?cS^>I v/~Gs4uUP9kk4rCKZM9WU.i ܗ׌0>]w[@Vc?/pDŽ߭D={ߏ..ƫۋݿq}b|ۓOE]znê^}\~Qbj#D.We Qn@mM'},`}Ƒ巊ޝ}'E&{'EW.-]oPs|]yoG*vx#}^' A80L 2U9EQ<6|3驪UuY»R:ǻ=4`$[N|yχBl?#/NIQ"y:)Mg7>zټUL*MUިK_lnm~]ĂJ$O*ˉ6&Pm[jsiM"w0Xe&['; ͔U@*;׃&d&j:BsA.Y@! 
;"gos}Ùҏ Tg0_8s wJ6p\AHȚx6 oVJo2d8cPW"Y J~m߬ZWkjQŵf"x~ \2s1c$#GK'.pٛ-j`Vws1vQ\bX}r}F 'V!}MxY[nkLaKda 2ZL?ZgLmӰi2+ip*]|:_zst9Zp.<8+#Ϻ~ȶY۞4ZF*jrid1gt_j-*6&A]] ; ћ^~wW/_Wo(3o^W߿O )`L/">d >/Ժ8zjn1SےOZͥ#(Yf˭׃_Lⷣy/WcXՈɵifSrܓs7+H6&6MV!iuDEY֜>i_]VhMFh#F#̶3b:QNEDapS BaZz>uV3l]eg(Sgnnڕ-Y7\+`1XzicV* ۽SoLGoGܸ,0vZ`>SUwϩ8,?ɥϦ\%2B%’s[-ljb8*%JsB*A%8I9kCqH_6mh(>kcn#mdy1F I\-h>F`2%WecArLG`%' T{)BO?1+f26(A V#`1r>"̰>#TjZa1N1pIM&D .ʹ#f(BGt (m8(N?Hjz;El!\"sI V5>QJxbLDIhhV#&0y_ዞwEȩg :J.e.$\<MZ(\i͍c y>7zxxlXW]brtaF'r>khrx oVʽ@"ȭظ'6&_ jt u棪 ɩDzߩ"SdaP+[ymr `L7%ꈲ#lnVrRo+v8^N!bLJ@OHǿ6Bc .CtsȣRFe:%*EKhSlŗ(ׇ&z{vzrk-aoh&ޗ'Dq`m\Zwonmy7L?LZWPV;v3ˏGhm{xa}In V{w1q+\r j zwHnXz 6҂nM;|iAm*~q2oruA5Ε:|,aְJYp)VVF[)1zEzg T@ "[oslG1bZ-/1/cFu=P+{ INTBt䑳8db:I>K3蠙#JHe4P21C7D/(18s ,:QP%,wVzC bYV!.Pmȸ%[X48x޾ ƣ/_ kl*dF9JI(DT 'GʹĊ9Jdna!HP6.C6\d{8j+JDl m;blb‚*H$hS9wkl?+ZwtEk^kJ4x阡!D: kǼA॒ZF#!s냉2OXY>I]FR2"CG j0GQ1HBNЮtK1rևQ~FdX?NՈFF5O1Q#KS'Gi $i7J  VPdmcS P`B@K ͝"KsFp=y.:qɉzҲz^/zւ=*XXpD"I[g"ڈh*׋qGWEAm T2/#cg؄QюGяO)CCAPϦH&Wg$Skhd*mgk3֞T;?pŝMFd4+ML9V%C%Btͱ A8RC8qv%w;x*&T*>\xJiCS")˖t7~ 㲪Pԧbwe7/b$hAEԄ$ .Fb4ӆx $D goQ!'&}5O1(H @mQc 2PmB@]8{uWsLVWҿ{u~|>>kCVnl-Vۑژh<7?a/ݩ,U4CnR\j(CKrzKȡR#=5(u6f6<|rs:$0۸ ׆jĎΊUܹZMq09ƥ*E `@JSfFXg694928MÚ9F ք9 #r2㕑 F 9՜/,X Ή(\Զp ETBk<'J 'Y&/#kɠMfj  Nxld C'H:5[-Rg-ӚRO  41A BN ^IՓn tw\6Dcxx9Gfsj(Wh壳FZ 91B@D"Ec$FBodM[㶼圖ke{x ŮY:־9} y](fϮ>D" N%c"^Y6Ї-yקD8x>z#w*564H(DpQ' dLT3QP'3 "QGkhIl6HЈ>,RW8q9m//J7`#36 O@A5t߷o0k6l2lL1w9*$g3lDr3SN>/f6/wSSezI9h(9ɕI`[# cC*aM W7:_z;ѷ]뜾?z3sN_]l[H!Jç@'ƅQ{}q JXX !q 𧓞8nZe^ݼy9?tS yPr¬p|">o(q$%U!֨fp3cc Isdx SG'pN?PRvem[ܛZ|ZxU;/ș2U[)qo.ܑR<HЇ+Jx lUr^"Ur4M$)DRʙ Zx͙O  $%Kl$;[f@nOpEJg6Қ5Ǥ3o(=%LvUmW^zȧ(+Q"<5?zhsYPvfP  W$V)'! AVe94 @&i@QnV"9%T62nu4[mMr=-KfQ'a n zr< &Λpql.6Ŭq]ZG,%haU*?}Fw,G Hkh!".4 )%2Dlck92dR)++ -K e4DNQh78e.x뜡G `TH A!F849{ZUW`5'/e"~pݾ-z_v$`<k5i-#_Ɠ)Zfwt]w=N>qj7[IXFCs>eW l]?.fwtf%Yv- lYTܺۦ^ƻ;|yv~_}7ucއ/nm0<ꎎ+e#aN7F%nϛW"Cl{Ckjls-Rkɼwdԭ2kc %i*0Ⱦ0ԿWK].tQw@C43S "X`Q!£R3 Ɔ=(g=>ęv(z\>Xڞ$s8owZϖ;瞼}3rPCiМ'$J;}[ `wKnKwTNP˽poOu yլے%W1 voY= Dt~NJ傿|}pdz{UοJn8_* jI*xRڋB` 6'̺s`|U)*sm!λ:CkAia /sG>ۻ8eX_7SsoK߮gTԥ rw8)7ߑA8Sp@ ZE9_d[O7ۿj>_UH"n&Dk4yZGz'Q:RpMs` M :z/c;,CjN?!FRAЬ>Pd\ Ѕ 5/HК&bSej(w~c2Iy4%4~6r%ӎ;zQV?-jz4doeWp;rzߦoƖYP"w:1ͪW^Z}ҸA-T)˜[0^EOك`UFQN朹Xȗqh&|I*N lq Vo < Eti:t=H 5V&jx`yDfGH3lojؤ ;=e0yfO1V3M#O=fnngLtw,0QHDB<.$YG̑h3UK̈́h89C.%!!<sڕY:89G@9/Q]D$Y*9"IkDetx*E'!^%~~n:#;3oJeq{)D$IDzKsF%`ٔd }.%3zIfznK2|Wx@Putu()G.3UWVT⭺zꫧzEW j7&ugX7htpW+AeԾ߿~LB"w:IwT4`tEZ#Bhg#׺b?n?bGt1F{DJ=uu걨F-F%zo4<"u] 豨F}tQiA]=AuLp1FpxjD#Y|8]莾_DqU_I~6];՝Yp}3KV7vCx<~yo;9yטrvZ'UVm[|/som<]/yψux%z GsYEW햖##ٚ  фRk[1ړ{F4 [ 5 {w-+)RB:"0T*%\c(`Tj 딍ZNAKJĐeHTYi XgMm)K(Sj P%%ě6MEFUq6Ju'͝]mK2-hskrA;uVhJ*U'VPx*$WD .P`'dNL,~.>,Ӻf[-i띴fPME۴[WHNrZs,uݵPB8iڢm&o-OVZ. 
Qa:+z?Măo;'SQZsтH8e@Q&łߥL#nul/yhjI )2f@ɵhCn96ͯVY}8-ͳ$FvҦ?}nzŦuzzuQ <;%K^w9H.Jk;썘>}u{=Ξ,O_6]^ONdLkWt[OuWA#uc5#6L+BM%Z^]v=UJBH6 .:n~j_s_._sM|w\>RɬB#0;P"9lTb([_agm7_(d[>C*1 am@ʜem^%w}.Ր3&U%pM#tU{xb`";8d{>۰egg 6[n]pXuhuK- vOsfV<@<@QK9N4X,3XRȽ&\Dhy<%q9LA`\u913 5Rʰ3PU sG CA Ze*L@;]E.Z #Ӡo^éG&O<(`;(1b/)ojpՐNB$`$l5> IdDpɳZɩZU-TV b-֋;廇n@CURʊj-&t"8U MXc7()hIr54.cѠXNk\!9o @gf' 0Wr`[AF$c;$~Yc݀6QAp ^vˍٳoCl֝UŔe )lׂϾqz§7?;2CՎ;ټ:e|=cw#l"w #jT8I%euTг׹g-qlUu((ĽlWq% PѺ2Vś:phd0r> /WzyDVq=\(tvXNf$mCvJs~xЍyh^0=zbx U+O?fTBC4U#W?čZ coFը Ќ 6wvJ2Pr@tBG%X1hn5āCC-0Eu )RP*P6M>J)T1'rP^D  'Dk䉍bJ XlRMqQVo 09vƉA6NľL&Gb%4r˨f?o5dkRb "XddeSH>8I>Qc,bRe>H`3& VOrfIXIxZTd"ΜTC Lj+Y jdadVd1imgA}sJH'6BX#gG9]ܾb˹QXRvXdLDf{czϑ/W V*&"0GAOa06{VYbKXv* V&SR[WBjO:Iw@T ր@ȱdQ`IX8bY38Q*M' !e&T!̆rKbTSšZ~+=ʀZ)m֚N^b,7+|43w Z3BLx(#"&"mEXF9WxzQA<"0c8d_KGxAew4|!9OVڽ@%ȭq_(7D&"lq2}G2%#'fBLq-:Vux%BW[luBC 7Keg\򽱀[i}u-H>ܰ%4d-u_V:lP A༂^/͡N ̿"8Ug6ty1UpqCH+vb*ru-?gb']TaobIEWDChՠQLƸM\:b*_mֺ!TDN bG1#QL m/.V嶿˛C [_^e &*>.%ꉳ.ϧ+/cѽ֧ :E:R:!u1u\|Ui/ԉFJn({3 ԩNe}btE--q)c&rn‚;3f08Ö!sdjJsIVTrL`U±L .<0ުGCWWqX~wɸFm}ʒFS#(M LP@++`L D æjmj*٩6 m!Am!O-:9;$n@StIuOJuM{&N9~[RND^{}=&“g}_'rpur?~Xl\++r1$ԕK,YrkErTdJ)kR/-r?*hJM ZR'*(-sk7.-J9EnКq0rv1V.o(W^uᆢvog2R<4:qqqg<ɯ÷=|`H>* |ZE"5(c|`s2"mxzCZy ًEIIԦMԡ`(Tmk i譎N'r;Ƣ頵N*'İ4A. \Q's99@'RNs"xȂ EgԂ5k^( R]'0]29ևSߦ~N[#t,q0W#5b8hăF\*8;#,T*.WGxiL("Lt'"QzTյTD,{ SIk&wmY4if7eփMffdCbI(Aj6)VI ؖ[uoSuLCum?\&|4uJ{͒GEƺv[Df\`UľdQ@ koS,5+9#GTs $/zxx*q*I`– [2Mg}F qFQ{;{?Kͻ~Ju&"z7LDKRRkށ#$5wd!:84T:qd!9Br'ϖe%.\ED y!E9Q tL[dыJ eʂ Brў*$OΈT2s@ zOF֝#Ļ@ƓѯwwW~_4I{uYs.X/F 4͑EFK+ Eb.zݩ3-P! L#ܖRZp@E\ZL=ӞL1u ]gV%j/JT2X?dюۇ;)IQӳV:6o9Z:g~G[]z?׷ +p6iÈT㑩ruRh0ga*<U]6T Mv |x)Ʈ;[rSV8fᮒٱX-[k,k0`ϿA YgB;q%)=Muv`?cr}r[\jXq o~]ռAsaqŋZ\nd8ޠoĕU%7Xm7/b.^U8d)*[˓Y?Iy?&0^j.Xg 8i ix+)Y,<í2qW< 6CCpId`@sˢlL[b9a cJ6QX>i-HQdFE 阸o+qW BQw,uOsg/]n=_&9"&rLZvg u_K}m.17m&]j2&md+J kM e$)B;{vTv-{Gr};[d9RSΔظ\Jkfr2s3)S-&d/l;+EJQT=ETOn!>v=ݙKCOh=U.mWs|i .ރEֆY$5,OppTn58Xٛ.,c sIHg' ֜CP¡R@]ܙ85Ax2r7g]ig9żq;7Žj+95ovsTr,#$G 19$!{3Se(<Yb6 ʉ+:ylDSܩSr))Pd༷Lx1ZJ\FP(]ǻt&Ξ̱OXp6 :g|ܳ掣⭵XGMg=),Yfwtݬw=>>Bnm: RDƮ]^ߙ+\0^Y5&غiM歷ws8v&s}?ͺClI%ifn{|xw1=> koqu4[?˙uG wH|sGQo[NMkI䡿o}8}ݟߗ,[en=7?7s׫m/yȎtԝ*V,i&z|6e uOLse䘺zBFCEjA;)''K! 9ABe˜uDk5`c[ϴq]/JEE$Kۓd]|tk뵺|3rSY-Fs0p#eZS|!immF炗IDu< Ja 3pDJ*/)p|ƥ͐MRKS.QD0)Rdk%U) ?\C :\oWz;E$. -=[m;vves^~+mYgAEC}2 z&ј Z0Et4g F3.cozZMOoy^ lzdٛdim Ɏ$ cTvJsy)p݅2nGȖ N*kԈ7U*:8* 7LŨipLZ`zo^;D|BJ_+:h}G]B}(].\|#ԅ/ 4 p393كu^JGbHァ Zu4a?B;ŨZ!eJz3f1;{Oi#L-RN?q33"/Up0] dp5DUoO3l?| 6[~+Dr3M, $KlcGx 5QHix`EI+gHeg&' |4P_=]|vsfe{~Hzh > vCy4dbΪ8O&o@Ǥc>fD*; {!(#}Nߧg3谭vEr'b8r'9zp ,l IISRvub -Tj+v67U[q=D&C\x#wԜ JuɏW[`(u6EP\3?U8"EZċ)"竇WX}ts)+*m>Mc,4iU䓤dO0WЛǾzƹ5+ $\Ma#@YG3k? 4Y#_GB9_zJiV Ÿ_Z=JtW$ B;9*Ԃ8w*T.:z=pm/` bઐ+U֜tQ:zp%lW`/RP{B^!\)$e ^ \r5R%LB%tW0.Ԏs~9pUȅQ ;\!vW(G$XQ bB:{B\BFWHԗ#]rBj7W% [WWjϥWoJT~~7} pu0* p::vY)2ZdfwFp-k;`ؒR{xw}ЅyDG}~` 4y\> .i`. # 'ƳH8:TOaC0Sְ%eD)<.wX8mpQCiqICY@|7O3rp3_隧_S@ELXg %q+`d҂YUO7Tkw8[STn-Y^y:!1Bl]*Kz1TgW{?RO I͕HX`S d8N;kisEޔvpo\nҒ"Gp?VS?,{nҠ7vxSMm \~6? 
v JG!jW``y}!i~zyj)lvRK8(DgJ_=U!تBrUrOU wTS~8r\m|o߂旲"&1T wD15יǟBWi>y}9gl\܁D2+T 2s Dr"G s%7FMQ*\JJfnٔ&| {(E3C( gWsղ|9!dc+\Յf17.vGjfN ϭ<ߣÅrsϓi@ vGE6]ק]GӪO)MV&>M ud {3ydߐVJBcj5=N`Eo٬:Ė6wn=7y}GK-peǼϞGl_aϺ׾ki(F]][s>yr޷yJu04?r{u3BSht-hN[uD+ F"NS>;M]=7u?xiSOe!9h2r%" )RP9Be˜u&p-LAET JgHY0crYDmpPOΈL*P\eDki{hj!Kd=hLZ-":3~A:ixM젧z:yiR1zd7(1TvfnP78Ϝ1F%#Nib< O]}n-3#&8Px(oU*:8*%7LŨip d@J/"O׷uUh}6" " m&p?-Kmxa.AwasfyLC Kfg<~8ŨZ`YxSb'Y=rls7q7KUh/y &gyb׿K׿m}b ۯtG s4:g$rXuJh3מ(ap۟z{O^(rww{_~=(Ff@% D$+~Lo׽ƅCCixwGx`pGy#0UB!s1 Lt3} [b Zd!ew׿yW#hrWN#Iԡdݟg?5ѬgnI[ivNc yJEmtĔJ&A YEs~U)S$a\HfMF;3BrC@0@̙曩(&0jyLX?g>o _1ϏMd- "R =ZWUYls8׿ގӧڷm[{ƹY8$?nn߇# *{E83,V&onpȯ}LÛP+ˎ;BV[s7h|FZz$1;@ +gHEg&'TK(6㡊Bwwqi8d(Da={a̢9/?#F_QKm-8 9}vՑq"3=Nb\rԝlHK/yR!)1""zeLnHkT57V torԃ*!^+=VsDQ(.J_]KQU2SHNX"S*,Rĝ ]2sqGw8dg1(ZZ3uܞj?NDQ7O3h(hpP<0fWBX.s;'!Dj$r$ |ҠeqN;B&V1磶,GK#,FΞ 4= }+,xU( g~VĦ>_ͯؒTFΡ R0)%;f9'f>g.ɀj,(YN),*"{'(*̂j}>G4mxۭsZET0(ׁ('1lq-RZATԣZނg4SA2S":6͔q 3T-0L9Õg(;]_mWmD 8u1rorD,eA|7nŷ.v*qwwV޾k s/LUT ƍ3/w!T.ZzոKVIjܯ5Rӓ)ɸ0_ݎ@GOciA#Z:rcj<'4^vc4BT5SOW3$NzCA*!)K fA)R3guBJƅ.RYJWS<ƘrJeNĽB/ `㐶I8 q-+;?+mm!ijϣ*MDz]Sʻ>˝ƣZFT%2kM@{K`$! [B]!3,(hʙlWbp|A1kPeAuKLFwX8EqPw[3;*`|6jM%FPo_ԛ.NkhRP)"Kkìv$O܀i6.d;eL!pL28ll799T99A Lu-spkr R77y.],uedݬS6fL`i fVm)I'ĤIq'VPM6fg% ~GvFZ]cC̝6!X0޵4Ǒ迢RH1^F^{b.KJnwJeJ*iJ23|PD)R,A? e rH \bN6^4+CE5`X.C|R_u`*=e>r"dIA&P MHJ8+ ޺s[1/u呼S':N 8& mes]Vص`lhXCqxJfȔB 8. x`I*ԠEנ`[LoJY*} %nKDL1ssh=)-aZ=?{lZBwgAf:z滬.ߧ"bor|Fbg=6=}{ٷ{C s5^ý>̋ӊ_[9\!7/\U7%g w~X_v벨`ϲgD.:= -:xkіݴ\ʻ|iwyݮAT3 9 D](Xn8nmZEWdjg9{?LB'*:V>O+#:c|VVŋw.~4L@nY:Qv 3pvkGᣮotW< \w֧К Qc}Ձ>9E`d,Dl2vxl/Gvu*~g@VA0ڏ#~w1Y7v*m>O⇋Ozy>nmͣQ7ݫ璴}9ՑU :<*sv]9\Gw/tE?뭐ᝲ˿O?+4|ӏ?'}Rz)į?jkgC|a:Ж.CǑw}o׭ ~>g%w@`_)SN9V;n>gzpus;+:mUw"N9gmV2`ϟFyWW&/}Wf|xi 7ù:h|$ڴoG+8MG(PMB:Hl85yJ{ئCLFNeQ"U1ARBjǬ٪7@ZFfF1{:[ŵ>L&6o\洝IQ0i;a{>SNB8)Wܐ ̉<ÒB\mQy6$,TCu͡s Ny#55P,Z R&,-D >S6Lr@"}fXqf`TvIߣs;;uVmԷn6|Aw#PSJvmTɦY6YDb) %;' Lz̚=R$MDְPU1A $Hl)ՕlolׇϮAWCU"\\[*=&Mfz?5ndC卫U )G7 ^mkcX.Θ@ 9O`Ӆ BibCj8hn4܃נ(9Dл ޸?z7%Mt|}@OI>NS+eww@D0*鷂0^_KQdH"x^:q`蝬X!=884[2"sa8^*fc]M )Dj}޶aӹ^m"pgE.ϕ oFBu3=cQ𡼞(2Y_13^}"#ǫ6[?Ui!SXDe3WWL2SLTjb:喈Zִ3d1n7ڈ[ĵ[Ć.;"ڌ X]mLRN!rlЅ*elWp +bX u 9/Uzׯˋյy.T>}i:]tM%i}iM4?u=|AO9u0 fX+&]Dd 8&fk?(vqF}YvDEsy^ 6l[҆ bngt Ar\#l' Mhq D6M-Kud9vr|%y;`N֟=+V`Rqy 1L]]j_ 2vt!8Nk\v-2k`]2vNH2Z̅bMl [sUq34idK%(0M!9"R}r|˞Ūgt`"bȎ2%,)ۘ -[r0nIחv7z]yW+=yTn◕1B!V )6R/! 
=죓LHU]ZJAZ ̚;Lsh0~}dC*J*[[8KƜ 8Lꫫ1Td+NVAd %(V>z6l:wԳ zx[*#r֘c_SDTO9$F_{%턥 T_?1wO,pS/XV!\rUs[Gb3'~#RP| 3t"YpV%LAa!1FRч$JJWgqOñUda .",Ӭ*F*\It97D.xʍ@rR@Pyֶq[)3E/E}+ijEu(l8!b}YِZh{uS)6רOۢ$kGG{yqiutrʷma$ziBœlD&F/RԂ-4q~&sѴ*)n\^׷ׄ%7X8ezB&9ܛle_+t>ҷ=-=$^Od)֕VQlNL;8ɲ%(6\βןy9V9xPWV>N_ubXo&TYݝև}ݫW~7\]9Ssda›'6I@?1B}n/0~|t|H9>< hp-*ەb&V=cW\5Unwze(a#Eb'JN8y]saq3aL= Wm RyӪv$ rX۔'vF1shuؘBs.blQD(12si&vO5bQL% PvynSg#x?>hz[scO1gDV3lȂt.dA}>l;YP%Ǚ,;$ r*>Y)յpIo$(^8 @' !O\|lNC9aɸ֑q-"[$z'KjMQHP)r*LQEJ0 BtV$]3 Ḧ́6גrH$hZS)]baJ?~GH eQo C.I`J(c˯,FMC=`b M}; E[%QӹbEbn&by38KEXz0Kb dŹwU`pd0c}|H%CRY#JT.^( \ҫB]o7l:ÕSm1qEQr UD#"Hs %&lYAAK(H$ nvS} ő|@[Lj$^<+\z҆MGMvW..ڦX0)]Dkf8*r*ࢯz-S#G2eU|p@p{plć^++K>Is=Y۳ƚ^[/=3Ňn6 bkTKf)^b-0`ؙ`sjf/~ /NŇ{k~ ۲ۛrGto7<3{c&͓3Y/54Y[Slf6pUiM}Rz'kd@Ԑ =[NZѕ8u])euu Q3R\Vt+ n*QH`hDѕFۊ@K4J)κjtvpj,]N(\kGkat5MLWa¬V=Yvlftַ+;u])euu aRQ?/~y|-o ]+N]Z^9^ۻCu*gYDV/7'ﳞ׷U/ne~c:m/)/2*o˺uir=o_PEZ ~I$۟<0#/[xZZ" 3~u1(׳} ǭK l5?e}vqWn;!g:K_yi0K\22,I)׾Q56\ه(wQ6kmڽdCVdg2eR,|pC#97VFr >SJvHGrYur;cѕZӊVhRʍk芝x+}9ҕJ2YWG+q)JCjFWZWg]<[G-eW ɮ7VtZ0u])SIGWE4+~G~+ dWJuJxi*nKߐ,܎K+R8;f~JB>VW ڙ\oB+R/=RJYW_UO ƂOǧu58zQe, }DzIBLڠ>6(mSO29m8´zF966+Њ6ug]gӐ,,Jq}jEWJuJO2u><턗 O8̾tLvM]W f $[P3RCO(G'K44ud,y/])nj&-8u])̺:F]FӒLhFWk32J)|uM ])nhf0h&gP)cU ؛v *mFWJ;J)<]UڱөW,|QʁWGtgpeU4UoBI>]dc3O) xɈr#ũZFVX:iGVq~~4FU7qV6W27&7$& (訙me$ZA|e0HGrY ƍ ܊@_zsJ)muubCb!])pftc'+u4u])xκz]a!]I;ٕӊ@v1ʛHҵ+ۆ])nlFWuO]WJiy*K5:׌S+R+L2u>ɮVtI+PzPWɒ- 8r3{[ҕgWJFWbvz1 -<݁g{FhqBӺv%fw];!UTEHѕԊ@t}$YWǨ+g†WVg6Jr$Nwߘrs"3~>*n~dIHP`v9l+yҦ0Ba0opN mHWaⲴ+ וR4u.UӌG јYWG+a+}CRvt+Puu mJq#+F+tQWSJ!]a$hkWv=LuubcCvť"ҏr?^WJ)v*Ħ]8zṑ޵+vRʍuE;V=~8xyWpӁu56ҁ(vFf]=)$޸2JF+V6*W0qQGdugLgNU['؏*4 Z~0>*Ly}hƔֹc\-7s-O{1=N$mC#9ӡh*7)GrO3t`Ќ5sIirb;Ϻz][ҕ'nFWJi)N]WJig]8&798@JioRκ:B]I"f!(pCpJiHJg]|boZҕiFW+Ԋug]B ޶tgPS;Z6+ІTJf]1D sJq;J'])#ͺ:F]%M1x/˗W" o~s_^]_]i\ݏ[YMnU?xߦx'$oZ(vۅ7ߜ.ݙcg՚f} ח\kfbiwt=5YwF;:r=mq@QNB9U{hՓއt܌Vt)N]Wκ:F]q 1٤8|Y@6.OMq1> x|yf™HR`$[޽$- sLl&oP\˭ J&RZ__@r`.y L藫,z~*o|ᓣ>SQO5yu?xh)7^ܕf wka|WF_+r9 DO/{-p!<,jO)?S@P/QMG^"{Zu~OŽ.K}UjOy.l9YC%]BvޙR<^3c*Ơ|wsooHcu}@{LZ^[jne?퐩7&K.RgKs;lDK0IQݨ}~,]HSΔ);gr1+VBbp`!)>3m':R|d?,4(|_ :0 un%0ηl`:  ң̙qBVZN'm5׊.R\WzTF"J() Byrr9ICgVZRr<6uחH0·A \ $C&%5pJ1CJh-pϥKC(^ИQ1}¹q]ot蘸ԘPן=3M럱eKK5BeІ 8d lT cw)g+*J\.Ч!deT51J!yD!DnI&2_To_]&{8t=:uL]Sÿ!Z$CާݽUL>!SAmh!!Q>y:ھXJq>ɀ)W9$}m::~N?\[2*dMF0Х@Vd8/z;R@w% ( D^lJȑzCf0uӐJu1(2> N^,, s% P*zр~;0ɚ_v-2#Jnla}rз$xFtXdQ">Zh9~4 n8XNƅAD&ʙZYbU|狞J|XdPS9FX2Bmk0.JYj͸t'*( u]ۡ|v#W_eWc` C@po5)7f=zR3c9PAm{D@( KP J2 86Xn>a'x QA3"\-"U c"̆= \j#dJ #(JDW_ e\L⫾CM, "Xft&[aE$7w5dBHBA>;*7bC2Ȍu]ek+t>a e(!2EK 40,5=ucau6@xHA¢#~ mT#0=W}H L8r!cl] ~{=Ky/H{ZTTNAF.@3G>n+,F5 ̲ wB{ i :үJa@wC0q9D5ʽ׬ II)#g?tZ^Wc~jBrGߚ" C(g u'@$1ۃA7l5ǨcXk?{Ƶ] O-! (pMylhRWH޵IlSq,,h83X^{2&236ޡ/mb^*"u40e .ؼ(`NR"OhmY;l780%h[sHSꐅ?r*{{Y] lDŽ0\P8a? (}p+&{@\!i8R)tpLECLɗUXzAzjN DLWXMʴH k"CO , -VX(;KS9H"PQ? ܗ9t,Iݞuy   ")ڽEXRr(a3m>%d@;뒄 XyB+P(v kћdG6!XX-;ݡl,r MȚT\EQPQLMF"h4 ?xP*Z#ݑ8dDYUõ-"PaTB]$1CdгlD!HDUb|؀?pa%퐎:F& h%D[vj譊#Aa-A5mU(!my6(a*`V-6mˇXܽwwr. lX?]N2d5 jx0u1vtrL'Kjc+a0 mۿ) DaQۆiMEѳT$$I"d05!7vQf 3:6}QaFI* wQ"Dlu@mk"#* Fns$QQfa R;(Xf$ hTQ)RC"E7nԣdؒ7v+TO+ AISL1. 
;wn_* 2=PҢ#BHmr3y%# z),&W@QYjQ3r `NmS; ,7 sHkԨTg j̤y!fj)څ'kgi8?T赫mC>A[rgYyP8kL.3|XX@Vp* [6"i/zB_=0#ȃ9>d(N +RO࣐XRJ S\F*kLCzul) _YoH"ˡ蘅@VRp2 aׅK0q$X DBm'3tECCޙG'ՀߖzQ!w󾸮͋jS Bϻd0ȴ͑ǢZ.9{O?]an]hp!#F >~ LNRl &R./Aή^UlFݯ[~X: Uyc6_MWo B7.\lb^V˳S3yF-OJeR,NrvŴo06ycE"7b)7*6F,da& 3YLf0,da& 3YLf0,da& 3YLf0,da& 3YLf0,da& 3YLf0,da& 3YLr@<&# ywEYO]/]E)5-Ӳ+jZtyXZõ'Ԙ')/ro +_:\rͨxMBJ]mj=۫O3Q95u$Ḿ?}?MoQ.w~}rOo@+VTU{+xI%܈"{*Ό-k`ooȾc@"zy%u8Me=_+?'pgN~8of)fo*Qz- O%_痋Wm'K`Mk*,9l÷[w,գm{Y5aW^ kpagg<$^n4r6MjϜqp*m}ފ> !4ج`X~ӧ{+nAxPgm<>Q YoSa:u``ο{q1n!IjZb}4q/^Z+|[O%XhrRlwc$dct`BŹ; $o/3۝Nʇ8m/՝A{ e!M tZĔN/nߔx{bj5K;%W?+nܸ=}t׏y*O?|# ]rf*;qz}xa: B.V'P!R,[k1fû{wO(v;BGj txM.JB0Sv"E+t ]7A6~T;҈v68]'0:[t&SVQ,{b uEr֥Tw9WRnP7MԠM`LxgOs(vsMkB3UY[/ 3-9U̞ֈPf'q3O,o:^}J ]J}jW0RD4FzDKj%88muyHq]yN\ |U |2{{=y>kۑ|uOߣ >a:RŨ}ѭJ(-\ZhP/genZMxM mqvD EïWzTo4y񅑍/l桋 iHzj'bʒ ;;EmyB !(z!RĹXl eqXhP#[G7A/XK7)a{IH˽DZ9EfObn`HPUA}8+'\- ۠"tOC!L9wjPI7>Sqw*7r >sѡ^/]c֌ޏ巗cTnM jp^hk O)P A?W {B5n,DVݽvНIXH^em^R;XE2r]1^-9.hܽ =OΎGψ쾭?Gǫ~}{{qB7}3A Yж7TԓfCOYrx.uZ!jy0! eX"O:FR=5)Ae|)K5%_uժ;*iO_RL.k,xK֜*[HkI9GiLDߥN=SQiȨs4qV`ְ@?>螔q1IR&jlU(c ۚZ I]Qj"ې㸥cr~DCM{< `)s'U2Ѥ o4 jPq.-COMrԴd$%ƾ7q#cJa}͌]v½bFzojwd,;q1w? eXgH JmV0LРIjmf$5s *٣AiUaS Vd s;|Q3xSjkh][}tPCAfǡP{`!c 1@ʺ T0!"ƀ-&E%U_n_Xfl>u5)hTV.6 `OX;7qaKy6;M bocWD=#q@;gSh3yJK .c[` ]"s㌠/J,Rq&$V5&iuv12V`5Y P􌈽ߟd~Ձqqެ|͒qQ~qQ\`O@oYEVNq>`ωAS] s "b@A\<Ycb;@HFE):bԄBBk6 !B i(/Āu'uMGcAMHD"Drײ[#'OvO9{{znTׯխy.짷GSN(C QULRYx &) 蔡$c^,U{UbihSu8dR{p׺RV+dE MA rZ )Cכ8ەxcu5LreսeqŦ,]!OR`rq z?jULz~eO]-߱T{Ɵh1A Y4kb&@pj u% ~'5|g3o>S'Uɐ)`$MY^Сe&@W7e IvB,:d Jbq&yd*eEf9g%zFvHU8U9XkHNʸ&Ǵg*mm-vWw.uD֑3 ^ |6ī=UZ AJ f v =tJo&" sL3fUQkE~`Nbv6պl􅦨"5"49l+ 8˨鼜KyÍ<{#Co1^@~9\ル|e\JD 7K9bOS*͘8a\tH,=O'%)0ţId,|$E2bu!X) XQ^gR2ǘf/"'.O~^z >^ք4kFyzzgc ^\, YwIVDU,F?(o/ʻ^Ȗnm<^' ۷Yט-o=VJY Yo=\l?-,A>h{61PC9γ=?m.d@Ud/0JA3ݕ)IyE>Y[i C^>rEL+cc|@<nUq>|~˞e,PȨJt)Pd1fz4*DdY Dsy5` -&@PYeP:P`tt9 fk2֋#mR8$po\ /bΖ|<ٮ:\G|ìqfu疻2ՊZhQ0􈐝PUe5j)N$J/<t8ItBπk|yl:;>N,Fy>oo☐JB5DiҐ4v/j.C{տ?Bx{o}][ {[IF?ף_IF@R6U-6cbc}ʅ@N糪PW|ơt;Hǡ~۟_1VwĿ\ 45ܲh#Jm8ϋߣaU~jM*6N2(e1Č1:_x] n~2l~|ǫȲH<5J؊c}dLdJT)6hb| /eH{ɼ{zTͩ9;H6 XBZr34|o .mFX箤wFm/Y>TkFRN$$jd;F^#ʍQ*1_Ng}>[KŹ|U;z H<2eYGW4ﺐ՛ Lz7Y\~1fV[;UՁQ;?f#Zg.# yzq{ѿ9}~w<9'{aEX!ъ6+R=:6o98^v:q"{rtKR 5\BN\ID2ƍbpNFo_Б9d'W,ɥsMLy&*HI吔C) +4K1)t(A@ ҳ~0w C%R8(>j}84LT fKV$-MtlAHlf@)YqTkƨ$15a ޴gu MM`tSnCwma;ghqwg9)z {ɫ>|s:Fk oO4u߆Jb,;TA| rnHo=~9>gh}nsc54ϞY7`i=U8g: iͷx'=X"12rN%B(FL`b=8?vxst :*& F""V@\LMNBEwGD#J5RԞr1~־4DAz\,*+kjSQmgt^@ [WAմV`IPF4Xm!@Ǣe`}6mÉsI=XP#Q jbɎ bۂ'@kev6jd > Y3UzYZOmOxbj|q=TKw/]S&(P D(#Uڲa /UV<*i%r\CR@.{ȗe(.*W21T JU57qkv t(]9cVsYDxwBhIb`O,J \Y֞Z^ZL/'-T,EkA«WdzWhy)d~X Ք׵,?L\~~]?LJ dFh$.U=kkl(_[xI^iūv?2bb#MCW^r.7k-i5.y9lavu6M6g`h+׵8CEs?c%Uwd{mKHi)r% PM`b s[ 4JRMh3kѶo&Ξj=,\65ImIn8SBzȏ'7gWP#6 p'fVbٴ?Ů1"M;~yf^"h1`l (1^o$)6I_{РuYwݔcΣ~JKV?Qq]=3Z]NEE%c/p MKnTڡ|قלˈU-K`Q pG%n)F{ThKmռ QN.rF{΃\HTI ħ,II4 ZK( LaM0*۱p+:~Q,_Xm/QS$(Bxs5x È-4|Ɩخ9GzJ=rpqNP4.T76>Y q4.S>}ǣbg6i"[d{N6WJ\ͤYv)窯f7N6pq+|K/Ue91z~Nߡ::߿9_ͻ@ۿ=q~#0>25`,?$yk{@ӪEL}6m*Uw[]nh||Jcv[n;Q~0G{kE<Ȁ+QOPl\Ut`u\RQV/CB4=B (x:饝 WCr84ʇoNШ'!Q ^9m$$&EssP[FiǞ|N瀝 y) YY~-M '*Qޘ+$+*.z$@@ D jMn4(aQEYH@GFB$,b*U)$dq8\ 1Ph-qvy5PO) Vn$K&MLSc4YJk9 zk1>V-0K9'8)9&>KR>oK tzUfX'TõNq1MJ Ee-^RGC  Ԡ$1D F <# >FQ>xoZ}ɲMR8͜Β(s#똴gQSl5b&^  <%bhdG2\牖 Yj5Қ8eZ<:.}\ϝbȌ/ӊˢjc%&oϣ# co*):U;^=|l(=,.B2a2;K:/21Dz^Y]=L\ Vxu0)APWSW}8ǤP`x$cQWYZ]]e)רᰂWz_̎^qwUYĥ=(6'Yq>%ŪPk^~s+FR^)刊{O?g)kkS /yA7AVqPJCv J73%4܄|}S/=qmN:B) A &Y0PYaqH\Ȃj n+u uyxw3b͈ͯO𘆞tk=k?R4S-'  hrd$%(ōI\0!2&g@N%Mr'ɹLh:/URh;:{Bإ/f83'{ݒr}̹6ۿQ0%ڍp`Py"EF?0i\Z@v|_o- $H@rm 9QjI`3]U ₨%m;D 4|t5Ǥ3'ɩ6JDO%cUH9u,Hə*.Sڅh 05 h59Z[e'eؚ8{Ld:|sH s'u-n-~d"FvZv {5ʬeI7 
|a->yٖ :*)v; cX>xQy1@wVn;bGLj?F]t$Zq\H9;Ƽ ;}Yx]w;/.Λp ܯ+N9FtgdǵWW-K:̯3 R͈SN%f 혈TR&,O BB?m:q[OX~^g"/q?EJԄF</&WOiv DL0BH)L6W_O.gg'Q ts7-??j;QkPa>ht&: /?Fm|ӯ_h Ht.NoV_Uށő (RQ#pEJLv%0KNBPJюRI$Q +ͦ_e#FXAK9VV"bq,Ɯc:0jdKH C!BZi  BZrD͔QMeӎ_~NJ䛯w4/@lbS"/tWu6mMY^ė=աF",يKD/^MNU"5T/ KȺ7 5,r׻Xr31 &7$zt 8j!nFˢlԯ rD˛?o~1km`0J]N) EIf b%ٹ,%MYM2v;SRH_Ӑ}CɬL5TQv`eXJ*aHELcx}4PF&/ -vAGp3 &"zw J0OD) }k)(pIaIb4&3]rZ߾oG>حBS;ׅsZ'2aw@`˕{cׇoZ3;J>x}MwY=cħ޺+d6|\`l3 fDG*E0BQfƘeT5kvqjŒEž Gp1q[_DOI2Dy) j_߫6lcV{?`f{!VC,isCP lݯ7hk@UZֽqWhn`J18]yMpmeLahEc)Uޒ}nʺq-^V޿Gj7 {:{(_0%!(RIrn V]w?yIn`2PÎYeZS85PiٛFs?uFp!=+ i 5;x-(Yyc0\"{}jy.T,P1O^Kp*iq@EjOi TS%Sz\cؗhdrWH>FɅ/*N4`t_ a$ANp7o?wxh8lJ]ۮصZO~>*HB N%C :vkEARH;*>k·8ki!/ӿ*mDA_"j#0ee@Vs 0&4Eȓ[uvw G`Ң@{^"Hp҃0"cIJ"J&5$%C>|ӆpM8Wtc!&tܾgmզ{wzt(ajP@[OI.q!N/ygH713>,) 3hXh2dBIfmp˱il>p'U+H'kߍs"WIJFIy$" 5 (d`%TRP^-o_@%"%+l_&M.W`EٱYq3|JZ+)מ!@ϭLJ8?"4C]W8"0cMyJ(3(L "z&Sؚx| Ԃ3TI?CChhOD& {B4;EӴ5g[ %-ዼh>c/ y5l` y60ox|iM2)ςI" TRT!€DKK]J0Jg?ay%ٷHtW˙_2jr@%&r5䅿/לݿE= ՋԚIGn\`$b59PL&/cA(hcC 2xdL9h^}W\m X"ܞr7Wk3ja!E>;Hg!seQH\"֚DV zDcՙ1ޣ:ۙ0#@K`t@ !O"h,yI,`mJ{3QsE^zhDnu\m 9F:%}EFTB$i )(EΆCރ'4(P!ЏSQ+ Jnh.|c|_A%g p<: ~!A 1ňDCU"5jF',r&G=tב,)T.A63S$Fwx3][O%h9A%҄>#iD< =F{ouMWu>9pTjs5ecAC)wܮK|O‡1,ՏbA2)Ѓ_=-Z*%_gx6RoCɆ RHNǯ_as{wɨ ={S=Wk{w  z! 3}X,]öP0.5OLjw/]T ଳDℲΫ҄y.M҄y.MKYgN M2;я#hT@=$gh ,Cn,#Z;_u]*\1 GA$= >ڧ6j!{T277>^_歽oyAsRy:R.y*> +%P ӕ R4)PlQ3XC=B`EkIL-?+KhGɤAiړ}m)iN,ȾR}ӳ{?|9c؆_n󐷺SC c=ڂx`X9Zɤ>/Yc1vh9q0ւ PiCcX  B}&sڥEwo%0 Œ4}ϴֳ'%c:CfYi' rH !3+?'',wΊȴc'~Ew73( ߣ~;^2HU @(gu6e6R&ǽ;EX_ޑ*^Kލhrrpގ$r%b@'= t?z%UCPbw n!C1<1<dk3EqR$KjA'KF(i@C^hS{gV.T7SrÅJЧR4Wg]8 ͍bASs'Q4qg5chb1sJh0PJ TјuFe#L5Y眾t̀j]R9ESI6Y#9(,)r9@Lhc@ իG0E iZTVJPФV*Ӳo)8IE&)˨T!&(G7QaHcP!셕LlW8 WcO8!o~ l|Tp l}`ÿBql Y7w_%n޼2:%l=t7O F'aX *^7 mS[0-(Yxӡk2 hkC9%͎5Vy㕐k50˷0's>~` ӛO^EN0{P#vgNP=p!ƣ;տ?eM:e]VgEǠ&=xlboMXeo+W_OĞ @aPpX=8G8AZcA^M~Yu/zx~'RO`?&5ثgbW`KTF]Nzmx:lI4d[z[p Bx'U>۷qO4Qm1E݇پ)J s3JUp÷Aa1vc!1=d>z5(81?8?($T(e)!)egSw7wM훟j!CrOG?lͮG\r~тRu5Ty3cX?-hc7F'PKBsm 5l8z]._J{-'@H6:J`'@Q_ӛ+Ы#i-Q! գP5X/|\rt>chRd!yADB4=ɠ G| m ' QsvՠǨG}z7tk(\U}g!Z1菙7@l胒>FTަsi4ÿSEZ2*ҒH"EJa^{Q-B{' 9Jw~v.*xV%B p9(>+}xztbnw}mH_qn ~6$r5ξ]M$(eGJ_PeZ"%P@$@wCWUdO*َeB$0 f z|?toq-+AaCqUOǿzcUVȳ}#7WoVbi]rݥgW?B'AC]; FJkNv;4H-;]=_H[4Z& x]QIøhc?rfzI7!wyGr;BNNy?ĹM{z?˯Wv{+b@;ُfh:Ѭ5[tNӺCBxlCĕ*\ÃM\U( ³hkuުaPGkuު!R'x|o><ݤ֋?}\jx=7їg}h8)ή맣z1zШRUzFԝAzF )j^1,8;8Q3AQٹH'$ggՂBQ  0.+.A,3X`q9JA2j" Gisc C)\#(PDvIytgJqK|a EXb+68v\&AjA5Ut& UdHsA t֜~z{C̀q3љL˻w, ʤ֔Hiy$EwQqc3EfRD.o'9hGipz p TKaSPƅyll\['1K0:DS:滄V3|<.|XKt+ lpE_uH * y lܤyuY2s *8$ X,_#uWGw#$ϻq3yj{&*Ϡ.ro7a}xoxGK$J?rIh鍱_" ?_U1],Wxl6_} _Tgٞw-Vs|)`lkEPWW7 fs/}ı '` Mb0PfFirrlʧZ-pjDXz^MӷO-N]n?x8x>a(gz كe/x^sEp~I7vDwTNZAN!  s~?N˘.G獴Q! sMztte֋&QR*}U:VHgl k)!ݐ{;vUnl: NAF8N}5?+;ε5|ٛ&g9&\Y4I,!D&=Y?*{ "e$)rA …Y#(U D&K7f*d ɺHHr@>e!# [,ɔEMcXbc~6DDtat%-[D+>@,$UaTUQ %}xb_ET<[R !jx=E(V(156 Ήwn*%"/l8'gZo'ϓ kkQZ~}jC }um(?YY R:O(C"$6<T\`R{Sy~4[òLiȊ S̹DYNPF-$yQ2͐q=wW".F[BUJw:@anNl$NAs:BG'Q*k1XvӒo-`9wo>=YVZOfy?[{SJxqE>ճUvw-bb=濔 }_׶eq4k ץ8QҺ ^DVpHDC2SPb2 Є)!isaYr)2(.'ss`vR6]\\6[A83,q ,]@̆QJ Ђ}3 vAjv2xX{,b5|wᗫ?xPA1`1 L<_:dPM4b>lcW-!1#)>۪:6O?)v<)vyصطSFc1/maUqtmBI6& LI}A3Icѕ1!+o馼e[J-<.oh8obBa)zLϳ'r4)N3ImFIR iBs9vhn$e2׆c8& A csZ\@Fa}iFnQ_v o rJdm[owVW^;`l94^ ~mf-\fvRo-rW R$x>}?R:x4"kd8 ~U 9 wDrLJ{xdmqD(1Vˌ[;LYyWOUfe!v3>de{ JZ]װk;tG]~QWA]n⩋vUWQWu9.jzP,LU7HGRCy .:V'؀GUc=)l 0^:d()@:j(S@G^#Vg G6P%y8 İS@> P6 ǐb TX}hS}R頴u#YCMVo}|k=] pbrRmխH3jN3$Csaf@m(?u֏ZzbZ( ۙ_WE^UQN^WvZ\fr AЬNHq-j +tQ4l˻dˎ  J 4ו/@"F gxoڲݕ.tJM2%)QB2a1%9;@ uyq1Cb ³^&N9ᤉS O]e&w0en|apZ( ߸%s2Ow="DLr TGVF̘tkD@ȏ,Nks0z}w A1VOyVpiVշK7{jMQQa1:7ᣭw_݅<D]#|KMD``ItBwгvahWsVDaEzJ(9 ). >Lٸ;@_׷AcJ? 
?c Þ|_z\9K] CUL um iI /;?ͳWAhT)kyM~. wˇD)D1.Db]$ SkoR 82hlJNX)$ie9hC`id8*4S+pSHC)Q"6u Ap5#'AfTG/"pff:P.%%B;[` W +F!wjagH 19u7G!ts\I-HTJbet>oΕ0Rzr°0o=waST.(}̌ꑋi9JӓnۨL$$&&ϗl`ļ>ǺVa< YE2v5/}"1?t~bF U ~>ss[mX*tR`ZPyuv{{"K xpDW (blym[l:&?Q%N7DTҡ t|?XYmRFZqWgx#QĐ3:MZ- v[kp0 q@"u"UƑɷA]6N¥-&NwC9nR&:94 qV#1fwRK7.N1`=4zr /v.r !5_vhj[cw ^֒1S/_-%pK?RbU?em%lYZ&]a2BR ߑOdN<&AP|llS}EQo#!]+҄-2_^YV'cDd=agBx焎 SKH^>ƂL(l jIQ#fJKFd`m܀F ~~el*m,̩ r6R3m#Bw/ <9 u'-&0Hf+K(9u|%e֋E$ߓ6H(rɝ n: ~q|@YJN_$Yr:͡qk)%D$yFrIz !DA2"ksrmU_ "6Us {Yt7A}9w.RMq:^v`?i>W.ϕs\LA[kǚ#0ō<ҡDQdBB%%!FJfB~uLQ]|4Xc?U]$b,&gڀuzSqq eM'VL3:x0T/?߼>}7Gɍ5)΀nd(Rm"EsVF@RRc x(pnJÈ-UD$ʗC +)CUhVa˳Ӂ' P>ǩ2)8UOK:%{KY[ P`?j ~ ̧lej!EH"41 !F,,Z!C fr&]=NaN=ȜpHX$F@bP# op(J.AJ2KGqhXh%$>~Tvhc&tH !V"q ׈arMW8n&9y8 {a589P[m'a4BFG,X+pQ$/*!(5&c܊,[U; n ,ECuZ@"oJApIE@FG0@$50yFpR jCY5D>ReVa+QlYn" X1Q$@ f AIԋVDs,r?J㈢i0iQ@җ 08Q%H*is)tj2r9x:])@np9tkZ3l)/>)E3NfV@~tdG[ 5OZF_7N[0%)Lz=)n<;=?0}g0߇UՂɿ1+* H/+](>*%08|2ݤD.A$G^zXݩPGYөgg:vV>P]Sq\((kx .5vbgX@_^a*XbYot sxm퇱$,@y+]^%Xa+7AqȐK\O.{rƋ'gT)KUb񘃄6ׇ=OZuId`QiYh)7qAGlڕ\L0j%;EZw{E:֑qM TZG^Mp&V t- wγiO1iК0;f c}/5exX˯eBsp=0;f^Ryɞy.:D)z>-9~%T>).6nij%~V4r+ag.|>Tl>Ix^OC &¡j6M.m7 ě"tc(AjiI#V5'c`@`bP$` ,@2 6 S `dD R`c ~\JI g "e$8 GHb5H _ , VRKD1RM] :PI(d}kowvFuP-sDnX;)Z ^m, hjiEa0rJ-`#lC @b'eZ c bQb)2u Γ_F&hOUf.& NM`4I8LϏ0DEA87X qaK۳ظ^#'w}V%;a˚:-ޥjX>8BPrj%G~j̳5LnNa'!ؿIfMoFSz7 47$hm-꿙;XEҎW~3sO<_7c D&K|3o2T+U3߬ǒ\\L1XꚂW"X \0' ぉ"#D_K[`|k.™kXrn tZ7E\$}"TrB6ik+3Z1`6.y> 2=v:ZE} šVH%t}"x"(IvHcAŲGd~\W " 1bL*~%IR>Ϲ !PۥQ߹d  rDS?;*qȏ{i7q\SNZo*`O")%M&IQ8x8&,8zڈ屋FtV]o& qH^H`߀| dC'\K5͡'\RM*r$"YŊ/E)4CJ. G] ὦC4QJJ^b%9l[Cbe>m|Ń/Uƾ(TpBZ]vDO~Y:☔"yb-hvÆBєa`bVvɴ 8\!6[P'M:^LFQda*Xń(rnY;`\#xΛmRV\͗Y&cRs c@l-Fժե> vҵud}:n}-[zذs͖^1`p3&n 㮒Zun|hubأOe jsRdORtS=Ś#3"PGR t[Kۡb.!>W~y dٚ LX+Jiȉ B"U軝kHх𵶄rü*ZVhiddMDJܺ*#D̢u@8J\;n%kߧ*o`nۇ=h};0HtgD;GA _ppэ,%Z_P02Ƃhohb`|1"_0R*r?۔}߿bsw%5 fM(eLwh [ǣ{;f5fLM#؏GFB8Gݣ78ky߿/R4~x..]{Q:{<ko,vz^^7ߝmś߮^oKz) IWo_xd]̖6ϟG+Ӟߺ>Oo|qdc ~~_Y1)8r}w<أ;;]W$HKM_ u:sO?}zXֆQŬ %ꖭw(5$O:E`r LF~/w~^JEN*e*@,`Zifrs*O5H~BN+C҆L-LO1oQ1?jv0ƴgZ'`}V>`uWA}t\f~Ǐ<~J TSڥux-u>E".5g1JҏU JUVH"X}Ah:ۓ#c!%ك@+/IIwD7N'>|dpx"XSnikiuI2HCđ܅3M{;ҁJCx ml/{q kr*/CJS.˗X6,T\p¥K|LOwO_` ?W6dM&RnvxܾbiP DyέR_f̴f9 F2WZI P SK %Ry δϐg:Gi)bzK7hdfI鯳$b\?{ `v ʉ+k#|zYt'j<4m)BS%7w7vWcXHjfW5xeʒqb/e>ƷgCo 'GJ@OP c |%RҼJ#G.I*3̤L(ϘPCQE:f"[Z )q'ymn@Ҵ\_#)Sv;_j^W/..V dx' yoe^r%D?T[V|3z yTY]>%Z\:aT7O<Bꘞ+1-^V|YUhiuFԂ u*k" qq V*&,!cMU9O9b$''4e!J$bR(*tr9'1&H_n@`s ؞D=' AZV-6C{)ca)T:>8G2oQ 5V|CRڝO'4mV_kR5P@&(E Қ&t1vs_@Ł> M I\݊3 [9,"Ǝq) %K*ׄڸsiMY%c ʨeM0 $re-÷ k,AKʦrt1Lq*L)鲀͉E O:Mf2 `̭g:08 t ڙnі>̏q--"SDȎv̉Ĩaaˍ8-;M7;$p;:da1U@ E|~'5)г_Uwz&-/ *x8@#Z@ㆺnӣ%h~Q@֎ۃC/'tPDTun >bsjP"MM;`GWKl/ɿG!*[7'Vu֧~(q Xcd&iYb8g$UDLU5K2%_6'(c5[O& ={%P%f5A5QV;}5j*Őĭ֎[+>ATDK?ކ]TtADh4=ֈ\ _ϊWɣ,GnxΦю-mx-iZt&bO߿=ZcFY{ *Nk'ptp.4i8%9A?c$!# IH݆Նxo2> $dBPF0w,sQ O4ڠm+na]bUXh. 
Z[tgcL#֓hAMUQT&e\Q/tRiҌaťpLg)`jy&lS cF)CEΧф!X,, 4 aWpARˈK7T`/Vd:YԊTc?4 L!qޭbb*|WwvGS]U]@Ꝋ#>#;n*Q-EW85P4罫Qο81Y*'x]q 4Ѓߥ;jQwԸ*G"ÔYji1G2Ù[T!EjЙxOB\?y%~c1apV|xor\bT0N,xTb`11rvDY h E *(J >yLҜ_f ZZX(b1VS*f Д>hDKiωX@Ef/`x3yYL @Jj@:U,E#x5~}zU=,# 9AyD* (ʁ }Z1H@-tq*@e0 $A 4V)!*) ~zli щ\?)Jϯڔ\ lɕ8uYJ!!WHvO-L\ ؞(|.l;ykZLIe=MRրҠ9ߦy8]bߢ'hm^7tcnCP {Of6(Ifnn-^S4th1Fp j^Pie4510(2&e6eĐ4<3ʤ[J5>F(h u:U7!]M*h ѪbquH )ixrGsð*]h%0u4juHS]OM%w4B>]M.gώuk=ɳ˙;jJ$hJyBq=%c{aF4*Tf\ `#0;37Z}P"5;L_#֠A R!gJJGɍ3{nORpG@9iO>tKLjPwK#~Bh ]vtwcC= #h3;[f,\(pvGN7]~i=o|~L'pe [ᾱ^;(Ya$z;cٵ_W|h  m/c޵XlfXc<8/q޺l;t0BaFԭ3zʼ;H*^YTBmw^v\9o;5L+Eʁrk@;f TJy&֮75,>J<,^&Q/Q<80isNHo*8 جxCK}rlz*+Rצ{~꧿(EXǔ L]q`ZhL/zK6ԛ97dmL2P$P).NؼZtpj`Fc@MuӍMz+H\= sAZ u)x{-F]ɏ#])a'[(jS|fjMaFĹj''T> *bԒ rnޓyIo;仭AŠX ?KU`Ŭ&E: [~idL L=(Ă߹" 鷁B|{:|~aўg;(,CGTĎ^i02\" 7f\¯Eo)kz=j^qQ ndW`O&i=~Ug*Z ͐>ETIe-'}nT:Ϩ#-noS+J&f QsJKJ2L>?MB42u= `ä?_íl_ˢ=j6#2LIҌ!sՒ5^N9Q;Ne>Vc&A0rq]q7#CG`dh5d'IvK`H:!7;J!]2%=^RS%6J\0K֋}$AH7 4UZ!{qT  T}|᱗gjُ8j>>TD`cBhku E%U[A%mƃqZB| N#R"Oc)p-~]O`PGMK͠=lUpt w_qa>+^%/ 2/ M7 BߣĠ\|NA0 ?6 Aarf̀` 3`0ye>5cQ."T&cԤ,Ʉm|z2rBT;k)OiLQ¸˘ O$7ޥTX ÂE ֖օ4$3L?~"vG%Vcoƒ6Q|GޛKChpٻ9r#W 5Mhګc!$/uRUdidY7RWVf=2d2^`q(@W9o^}'nѯbgųj7aP7]J)m$^Z'};6Mm+)*ڷsXFB*՞}+O'_4}e8h6݀:f=cGʃ>tIm8u*LN(Ģ #@J*wEd{4<ܯo72A: ,`QJ RUL|[*G`ZQ?u!).U;H`"eCDZB"󭯠g r+%+Q nB0J6:ń́0W˥y<TK &Iivh5ig!Ym~B$MXz}tq܌4ت0*%e]3Z#K`a|O=DϪIU'U?Z!&D)]M ؎?*TIt*mNH]+~D[ M_|B89h6lF7b:v0OrBF =FOf v?|1w$D>D"IEuerxj;Zcb;w˧m.PtrZCt!7#z(=4 ʧ"vןJ|KyY(^[քEi*i@to[YqθRZO7k}7+pRQ3т)S:ń@^|t = {U݁D2woW.wzw\M$UL . aaf*^ʮ,0ڝ{2OqƷk"HT>4ɍxǦAWr=,wX}e'y7R7H[OzJؠ g_b)TY  eyCbe1A\SѴ7PdTzHQqlå6TvkR>%}˹ 9quBlAfg7,;r![1ZHGz ×S7&ThUʜm|R\li7ldFIdzF.o:dPVt6/eTWw2GOMa:eEږ!| ЋJ,{;?bzWw{1\ %ǢgAL~>zm,bS_h{FPӽ.?jk<)xZҵqYm<4u_RHd-{$0) ^qx #Nŀ}??$)-mu]lPNGȖ8ŞDj>^ϻGWz  @rMF 9eHgQu[2(Yޑa)ܢ5f*0a9'>+ʕ1LOU[Vܙxu+ |蠬auRSs'sXm@ `ȁXAL{ZbV*c^M3 -M_`˵UYǰ7qv0̄CL/2R B /抃sՄkLmq7} uTג<@F[ dsj1UD Z1r|Мq ܡp3r FJSTR^7B& { rUhl1ѸQ 6=L)~).d6̫yOĚQ ^a*`&r"7a7hG`hj 2oR;lU:~Ghtd-0cvo}MX#p^h3Y\=Z&/UjÕ˯Ѥ*bV~X~~U`~m@9~ { l%aԫ'b8*| YgO;tŧ~aO1]+˶BT VBS(8ܼO 7 c3D4Qirr4lKi5sX *}ClYD!`2X-1E':^L3* z*3 }Hi۴c|*;\JP,n%)8zÂ̳c,hf? MMyxsqsf0w1Eg( i1):zS%|e!<>}O(Ba-92-]j#rK!e8;w7_[ൣ?!|:?a#vRrF-R+-6ś~VRDv_Г_InA)!G2_A0aatd JKJڍ[rz79&K.X}ɻuk[.@\H<6+H~E|-̫WWJ TC<\VH5s-]51H٭&3`Zru2u?md6> Ԭb'[Nľi$)[)_*%]$- *E7^3Lx3 "dxbzVp'e;5y _l~TsNh3SI'hF,ӹ}sAۖOy@]2 &}rSU$zDɆꑍڵAH9T,!R䦚f!eΓ^c[ΓtXnk-{q$)I5&B ПI[EpZEpũ*&s3W1Qz8]iWP*`XV蓪У( t,s.s/iA]AY}V#Ltm".K]xA}YW3_sYDdDveŘL>Z~syURh/L$#9]IoMDA_bbCtvQP^b7qręsZ2+T"\F.+* e.+P+Uwr5ځJ_RD,k"b( N R8 E-tq6GǛ% c)DjS(=xJ$SNdu[5b&˙3)Za͈"!qbñxDЦ@VE<</nxYOq5N &lQ^Q=-[VboZ! ՜gi:Yn<*EhQ¨DU 0G>" *tȩ"&9C虋b*TJBaM@I83̭Ra6rer5r!|%AObx7gi"a!ya8&Bf6w;o`$݁ݾq[ 씼#$"R5kr\n;:L2pAɆl풡@ DwBAjE8 FǮcax%yû.ԟ$awݎ]fa Lv *(-*-FU5R]DBV=8n4\IŰ8Eg#y %9\К](Xw6*EZH;at^5?^<>r~?>y̤u,+ਧJ:+hG\6Sj2By\vN"OTMwp\6@4[rv;ؔF\/؀^ K,`C%E؝u(A`]i-a(c>dYVyXDn'p1AY++wmmzYf@OO1xeJ<(߷ex9#5W]]U] k1*ӌ%xW~_#y0mNak%k:л^nq?Ɔ!2NGC//阽U e0 B0mxֻo[/_r+<\IrIh.zL` khE< :\][ZݶA G\SkrlO}g?xy5r ^2S/:h4RLyaF L1e6K'&A'! 
&?܆xA@ )ϋ ܼAjc: XHgGI#iI4"&K;5CO0=l=t\7Yno3Fx<].gN5vvB RRy+hMiNUBOpns]jg{R,A"ʖs+0mq]+ 6#ܯog%ᰦ[J:1u(j p9Ƥ.S)|H+pv=M).Uwi)lF>\v~yO=o uah*iY ?(T<)[bBe4a<#ʢ?6JrzlgP+xm9E|-8t KG?'8]oXPDraDt])ՋGËcוW_XKW9/H]]-a+bgIfh҅co`=GQN{K7ZqZKI`%Nr9 Zi ;=V̞ST@yקN@1B={;3!3w'[^1V.yB-n.8MG䊠~iKHW[Czqr2ܢYk.䚙2oю~ukDD{tfzմ> ^.w罟E9։qx}9>f*g͞ ^'0 !m3ܯ%sgN*!^"rNYU]bqy%YK⊏2BMtӄâ 7Đ$'N}<\ T5iAJ"?7 (V[qA-"X0f\UL1BQQ 2*d|c [3߮Iʏck_s&ԭ13ϙ~Όsf_ό׻Ց \tTS qHZz-PpNR$[ύYrǭ 勻5Q/GaE.uncxlcIaR AUJYZL,1нg8٘}0M.*]=faIO|:b\PZUQ#="u7.A,n 'guJ%wtsD98F{3p!嬑^j6zgw8} J^4+A5y2_nGQ%HS}F|X~* /sD{r5AGo2n1м~&ec7fV>~57.EXށZu^<2}.Z8G&gQPqQV}V\e%wDh%iUk8Q2MVʹNS^J3*\u7 _UO&lOLBX+Kۜ<_ -mĞ\Wc|ٻg5)'L2h 2YƲJF*YM2ac,y9qW2ܵ"tMķyEyIk_E{{.mK۷'GS^H[A'|ڊ`̴p?bU|fI۸V-mn$|׭N̍JxF4f{뻒T8F ?2TB)nlM o9e D76eEHvx"%Wcn??7i ػ)!A"U@4oI">Kh)<|޴3Uq5ZUCcN~0)I=̙Ȃ2(Bv喆mU4/4PF2V '`q'I86(XH0Nl% 2kSy5UŻOfkdjt-9D<0b]ɝc;ٳUQ\yw'`NN>WL0lhLrrfY#u\¾,PK ֵjt|nsIՓYޝDGPʪrXFZqM]ޤ1)*Jk\fn0/]{f@ހ!/ї-fǚ1|K*v6ؖ磵,eC6O5v^u%#"RyФV5Z*|<5";R/%nQm4BiX$AbH-e f56tH?s~nM&puH0I@yMoQ!*V6i…pDI=s@௶zz6+Wcշ^yy_毫68K9&@4P.IDÙ#gF b5n:P#5n:mh{(U(]sAh+Y&-.[ĥB Izo\ٱ!26Qs(xѣ']QK=b $( 9M6%;M{q6g-Z㋳rțewL{>W_)وrUR1U^ଡ1 )eH0><~`4co.aBE)?OpVc4<3-zڎ'`mJs/!~> B s۽RJ9gh5l\'W<Ciy)`-(j_l aҀRõ.Y>QMuOGDѨe"9M~s%(t6&0J@ jkm{wC6,ovzѩ_wi)m(*#<qè** 5$S chkA=Zĕ%QVn_b.&r3*g>&ӌw9>pFrWx#e58=##{]N hPݾ1ˆu^4QRC `\K`*^.8ishw&2AF?VW]X~i;5F-GQ\BEUS'$f7H=Å d!@HNvc4] 4l}~ h2+C?FĹV%IdFQa&xaKTZ=a8L\|TJ2A2QM#QUVxGQUIěI]B7M:W h.I{QM`2ӶEJ9lЮ:Z3ڲBhLT2RB.$LX":!aI&LټÍBhʛ (!O"DuLr-|cz_0\s!KpY{ UTaqEqզ.v{;TJ  8(+Ak5[nG.57G/P!Gns\µ/BɸԿL>9Zsq{?/ N'1$[WȳEb6Juz8_ =6Ԃq:݆3 ϲӐAX+i9Nr\Z$DcvGN:fԜ'@plFͬLK"h3N*>OCzqeeھ)#p SuS`ȌV:="+EDYbby@C @1)kj i3CA>F R[AnZaḏ5#De]Tj[@$' &t2-L:b;ZD\~*Z*Ʀnw#on ք{uxZT= 8&p1B= D9Jw@6ҠVkB~!#^PLGC,߹y:HC\n`?1N}nz?96Pt!,PzTT9HD $k,F8~ @(dcdnc(Dob#i4 uB兖"ŽAʄ1/bPkv#ع<'V.(NzP(G0VxKMq׎y^ eqGAkZQ&"^2Xȵ80BqJ ViBV7qo'Xۤ>%_I)}DR(8YhʣW $Kf 81Հ`JvJ`ⴋsgq48k?X^ np |r˝VqgySXQtc$}l:]J|e R(4]ˈN٫*pODWU!1Ed6"\BFg.A/ƤI$*!("\" bQzIb oM5EԪK}6biJP0/8 "|\*sSų! gZuoE *Qh\P[5U\a00 fSFjFԻ| JqܿǕplFvtpe(B`{L4J覫7YѷڝފAR6ڠLlWI͋>'e y5SԴ˷jaK߹Ny3 _3^D^ne7ם6훏kg)I;bh7hm龣@" Otw;F%=ˑ ZH"ha*AzkYwꝭxR9K;xR9Kx^8A۬[Jzq;Z_18[Ge}`Nzюp͗Iɂԫ'ڌk0 q͕޷up˄{$Ψ)CWTxNWE+a| 3)$ӰWZ3f9fTelm!mpe֩;N~7'1QБtq }<*qϱ#&P 4ӁpK @&mŰՂ)PwmVDAwڇԁE8XptȦ\D2PbZ;/l0܂T <=/hc0&`ro@Ay\H( ;N(>{$շz6;Bbv1zF$8  a8$C/D ^D\TuS.8:J^Q47K0xwE](%#6}ٲ 9R{|n 3TJ)6^RCۊQʒoS S)Y~AgНwyzkի/"?̛t~Fnu!NKEQ)ͮVM*JI$IFsK0\tWäaC`8agA(I%@r*3D,RHR{p@n[YN胒G /؀ID &".% HrɃEZ8"{45bR蠪ox,ΉٱmkMV)WeuSͭRy#>VDaZDNmȩ?5s SLꡫu_l?ԁ&ǦltT{ }} 5t42DDZP ^Mb6}3t-Upv\qr.a]n)99Ǖ͞eP*;.t/GKy'?_ĸ &/ ZCޯjr_._jƕ+f'>X%|U6Txȧ3?{X?,>}_ qhxyI+Q+/|~'n_OK[_,^Q `9Bfڀ\[û!iz񻘱3eAu0 GITʚ(OB/g,'a+O'f> AMs.$ѝO/{m;ЦFR#M E1fQVT_&t&lOa2SK'zdl؋E2 nQ_c[آ[p*w1ax^8aPcq\Zk(ux;GUnb3K???NfwK*xxA'?/G @?x&~4lY,9Y\C.&S፛fI3Lzj/IɶkV]x9Yc͒Zi;6Z'32K%c|'wq(xpՌIէzj/}}YR.ӳ*舱eJ~ڌZ.3-ݚs`,Ⳡ|Y&WY.gP臘tTނ,b- D!$PF`!n@sHX!\\)ϪP1}?)4_MM2!8!Hڊ-S΋0^%#ɀJ| -@WTͿ]緅Ϝ92I9zq-1Yat>)Ճ;vi4u9)gh3% JeAtZ{ <'T~ѭ:ɲ1e̪a7'>LA*_p Q}gLuơu>YޔH|nw=l쯰_/{p7OrIi›75FZ;S|HKBXRV !߿[) xƘ!/WqcDE4[ m;khn\\?3iG~fŻgn<_.Y&\s.hVXt9 F0MD^0!]U^uytU,i$T;hU}I貤_ƈދnK }h= EOEW.-Z<^-U>zRoM+F 㢪c-dq3 .38e)[>SnYԡR2+拻. 
qq!aEEplkuG>V;qTPGPX:g)fZ,%I'SP3 =&XZ̖nYu6>^^+dVkX¸(0Vi 9XBzƥӸ$]OWDIs T+QvU6QZ"Zb KO?{Loaҡz—I( B쿢y9W1)xk7| cCՏ>/Ԫ;XE i y sP]:c]5Au0VM鯢y 8K(;zg=q?->d>Є#i/gHiuZ<JF?;scn5Jck?4/8a}]WI\(N<]XIU `e:f> =Woi_@=KY:|F.,r@!m(=i:!K x҆x#ztSBQHo!]УG\0r3Ja;Nkԥ_ɍD"j|I'9D%Qi1^W;ݗ@L5K>M6HnΎRpx;#;*{Mj<9J)MU[CWԌ']4AfЊ-{>j>]+ KεU=On={q^ʡrUdcspm#0O3xiˏ{gr1 |:3=`hS8j ќnqswD N^G3F#WqHl 0X;$_.o%mK~1JCns⑌\2ŐvZD`dzn8ף餸rt=\t31`5̟Aj(9y1h~Mƃ,0qfjB%g˜sY\,؂-goM^P-!)Gf(͈yb<)Sq%[%͠S3(,h&?WێiB4TG|{UnyP<'Ls㱓qs 0;ܰADq  QK鸕B-kžsfɜr.G JZINKE\A&Qtk>=&# !'ʃY9E<މT.*V&JY1zG(H)H>^&,q,;q.fh%N+D\#s"`קS5JOߏ.+]e` bS^h-Btֳ"jyȂ:EF ~slV?Cn.Y9sO(o.@1Og|<?~OqV;BͿ^*L}z~|#hCRH߾=zD]J*]8K&K1(pSDU5Q!39TH&Q\.,DK %'B6G. KCBPyPHq :^Wxx) Upvq5+K 8x?4>elO 3o.~2ӫZ}݊۲M&4F'IMĶ'|[flyl5BO>>?HACzTP]-2D@:*91iN2#b]SLܨئ;OO~K_QWeElƐŻ|7 ~U2D6]"$EFɁ\mR.[lg[w8=O) 1 9ÝA^vY)Fy`c/upZ `v G^z/BRl$,w"ZjlXΔT.&89vQrehBwz[p8Z1NNb+* ވA$/GtOϷm ,(q<\ Yq0 ȋ&sxCxXh(_EBw \BV Pf<GAM_:`iU X.xh҈;X&IGfDۈ#9T}(L-ro(fP'[+iIV*8eF"#V|<ˢ1ks|{q^xS6bOx|ϱ%=!9cVI֧]\7 RTyzFVnZ%DG|9/$-gcώe=O3aW .Yg*1E~BXqvwJ1Y\˹ w%yEcO?axslo"QD9̿{bI]P2- m j[W}6pYJo mwLZQ~e϶~-P\ekTu|̳4B}Z TeyHĄhN;^.{0LlE*%>[|$io2h(M] (k(.hHwPd5g6v|ֺ>c闄wCQ2 IoCT ]]M1hl.EYvT?VYEze!IBIW^W-WIbDUv^RGjG: 5ڳSk=8(L-פSYM7[,})o'&u8 TN@$ˣ3r 4cBXX#Tj#T qJIM֬ҼdkC\}zyk@Irŧ_|MȰJ5vU cf,uSQ왾EP+s+U,2b&/\jKرdlD`XTT|D1$USH\i3PJQtDBUony%55"]'jp֙s۽Uy:]7L7Fwu:=GhtwȭM{(|gҔa, @Rx, Q.oXrI~4/gc¶rn^IlƐuCn![7L2*a%5!0[j'LHnH\ӣ|>îѿGINM}.ZY<4zY |?̋1G޾7UB)wj`w6UE;ABz[ZP̏UܯK!Xc sH.hS 鍑@(Axϭ4diXxmH,EgR$_6h84Fb\^X 1h|/M9jժd'_#؜,Ѭ(ʴ.{Y|f-x<'4ض3؃E4v}8vVc,(/0+~͇~j+e>je5}5quN(hBO6ғz_¼E$w.MdJ q)ڭ))t:,&[s'&Q!!sm$S,8ϟ6Wg]'{IIyO8 d+Gsr$lIa[dQNh'4}ܠV[ѳ 5 6"C12 z}u"TcOLMv~Duҥ썾mjQڒNj5he"ޭe[{;'Ю!nEWRMiܚ;§Ć?o.K.הà 4џ@sxpKc.cڐ(`xgu,S޻R;z,萛$%>%yIWW.R TE<4G=Jщo{=6:tAU S5aC8P1QlzyuMzdş?M}0acWϹryC1}G&Y- 7':c6?T^f]x$D=DXyW|4]5;p48[W֛^aƤTG3bہ󼏭 g>]%C^=cAE)XmޝG,p!04+*IS*Ki1nm54f\| ƃ631gv?-z#䅶>Nڌa2ŌȜT1rdFnKb>xHyH3ߛhF] \_sj2acj.[Z$aJ8hn7Bj$'~ǹA# a\3SQ|ӆi!Q=ѥۯ5S{ M{;@U3S&R;30(AHXLVh sYbs뫫7"е_-B]p*R`F@E#f0u?b^p/T"/%G ޵Fr#"e1 ^= 0Ƈa-[-R۞FJGV1Ue@TY_ >>;񯗿hʥeF iȴ:8b.@#e)ay}PFŅ}JH.eNHMLi@Pvǔq)gG/IM#:  dkCl|1x$J,ą%󮒮FBڻL.siC5*y9O@hauԙRrptYt=5k& ?{wq˥YÄ߸>Oj犱??B.}wBsԻh%M-F`7.}l}Shfc<ĥ wI˷EN;!&T$Wt)"xwբ`Bl{ 趽˷ :] ]C[]3F m6pqS%\EVi> !*% >P-W" >Zq)~_?s짱(l!x/wJRe͚UՇ_dUjUrF. ,[WXr@ __3.{u!lAJCW/}w5X=|!F`MId03+U֙~FVkٛF4V?Z4zXi}yY ,3MwtE1Ok[̒uAr&ׅ}SPu4/Vc/|VLǻs)C?Vw~dt7o~ 7*iJNd8G|Uu3NrļF#5SEDpJ)`Ao 7j>2DzEQtq)ˋ#t񨮉hLՇۻ'#UfloÈcҨFir}~ƍfElɰ}iL\nyҘ){/lOigQ.&55@4Ni4sn\,7:.nx7 ;;8Oar&.Ae!C;?/>rmXǼ:x5eKgwF锦Ѕ~2֌^;뼹c}g'pP ?30 wR8F8B1|iҷWx0Qp`sjgغ s`Ks{s͔!Ϳ2Y1A30OD: hɏ+ AېGV )mDvLvFho4f1|~X?[ԗCVfa ΧUz._|E`8\)-L)VP$-d}_!؞.N)iFg&X}gspkaDESUǨ1j;>ߠ:<ǟO3Q|rzsҧUy,sN- ;ʎ#fF<>ME=o(!Azp~i&B2g/@٩I=E' ç=ƻ; wq_g\ޝ|.ԌzXo1MSQ)Rze+1pbYfZÒt-PUݿ H2s(yFE֜OhZPRs`h)dTW?TfeftE(c1eElLTElM )`G#Lޑ,(@ 9Zr@DK-3qGRiddKǯ?_ewsLqw* =S$X>vڢ%?/4s/%t9\30{{Z!-s /W/;;O ;m_[f`)0ɁּdJfZ*\)I.|dPvh't-뼎Y kv>V֭OCuЁ.l'e0[멿ܚ_lnyB.(m<|r< +־\L+[[I]\XbX<*(KsvŵfwG0%4FQyS)LvpVZrv~vfpJ~[y4`vρz$if.&ECda̶&D΋&̵d78őZ]V߾{>_k=?u.*oUmMT 4^xmߗ; z}tgw{|6_wzšĐғ{WXt@2 ۋoAA0"`VGYZ[)P N]lTJTyrC-B"p\0m ,hR42kbƘfy?-)nQrS2a)8jDtEQF$ȌwC6:(;(&5~mxe' L` 6m3A*J|MHG}_0djd%Hɡ,@C4ff#utKIR)[jFL6.] ]YoI+^=%d F= CQZуKŻud!+#>t(>&2 :VZȺg菧Z3Mswϟ)AC,!*-oW18Ew!?͟FFE;OU/oxŦT `L-?޾Y|~Qx`v`5o7GgSYOo/0?LJqz~J!/ysͱq4hCF`Ion_D\G?K>qޕ~-/5&!xD(uys[s`? 
\vXnn52oq PҒ$Wg%CbT~[2٠rSr`10 sOgBk3$fP3rrb2HNT )B 'Q f40!*&7":طD@ÌZ>\~I '.5W %5盱t㖗xT3-/)1T Zx,Bշs|<>|8z -./~?NtB{n?O1enjHӁi-<tjQL Hf1w8K1Vt+-3nmZ@aQ_۳<̍bfq+J@X̌f.nc4z$Yxl|-\ff$xz啻Bt<8C{IW ^'?[*3i(&=~bBSI9`a,ȩMʏ')d"ϡ!.p(^nnEp}šMFhuBYCNl ɔ%$A2OrVhͬv>Ɔb=*hs}H$ xrњʗ7С#&}hCte&qR ώ17 Jå9رR^D8 E2x)/9Bg#\@9YpBbԑ7c'y F&)@ Ԙ C-gGs fMy~P~U"IJHk[bˈ$d?PJ~~ܡncQdDh9S=Ǹ8A;:p-ZsW@)sX_ *bD<2,θt˭! Y zh٪5 d;>?1k62?ŧ9>SZ`sP5B4 >8gbV&]VNn(':{-A#yf9)8af_ByY C0)f hMAX61Ɩ*@%F>T炐̳8dT7#IS^sгK (Z^ :EHW xwV쫶pJ٬;shC*Ak8Pm_<[F.)4j^.9Ӳpw1af3oh>+M&XRzy~r{3 >syfqzpH[f/4kȶ}'e^PPr4# Hg{n'23.q}d`QEr7>bʈn u/>?]yOZ crEk4Y[~;lT7_gy>o8jgh-̓8b+PWz V-7w_F۳xlsdKZh>vqh3`χdni]hn-zzsn= }nI&i[M/;9ι}]0G)iG;s-Ǥ&9~.- ­;OCqJSބ7.'GWjI*Z<Y{'ͨ$SWf|=m7޴oбלjQ__hMh$zs}HoysBCBmaȥ,\`3'`s ֤Jn֞Z3; SlU$ &K0XD$cPhonx3K-21LxƦs->k/yYhf8ScZ)d&7F˸P9HG,̫2`w_c2N^k" iN˪PVF(S:9ICa Rp@G>U-穿B5uR&V^ *̢*L-ETsc\(N:*AyT7Q-TC*ҵđv5#9ΔC"q,vLT>nW,mjeo,>fJ*ˇ;^'nnEh͖`7A?Lq8]oU(qC=}H-Vb0&?;~i1 ӂBkPфr1hJxp|}1 /}zW xT)%z|P)a`sr&TFsm)wЈ3ɥb<Az )ʆܨH1<չa*^=T[UVRCs}(YFo,j°F*ҊQpZ$g$u)+ṣ܏ҺLr$C/`"H&3~0Df̢} X %"NN}D@g|4S/rv<6Oh) raڌd @A,:@_WU}U]\qJ:i(|^ !1h/HU>ӊYK֊Kat4vW أVg Y`1fr4ZiTkE/+ڭ2n }!G\(z[D)w{F Jvݡt=y_/W;>[ }Uw*_xIු)'Rij}נ?|x( >QAs,X@`acQ?DWSӖqalTCMi sfta#gëMj8hZg4ƴ0֥mb촤6\a=^+ 0AZNh+"5vC=x  Gɕ#A1x ^{J {j6X`6D#,2 TTc8Rb,^o -I*$D;Bw=G.d>q8W$}7&ՙ̴W^|Ġ|P,LSAx&<0:RNxk ]2+r Q'KSm2J/%҇!@zf;shf_ȯ%ٴQ3M frG \\WP.k7ЏO0J!c>CQlzD;_-xY"y2],OC7R ~N+dD6y}*h*ĺ*]%k+Ϡ;5Ϡ""%{̞hH52u;l,=ܤqsl?}Gw-M:i+7n-JS?f'~L* t|C2`Ծ-Ӷ,_^Hp}=}'7}t*gCB PJcaG*l_&;α]jL@7=C*1jtз4<ЮaIUy!w9q3./l_H^ +d9C_j,!HױH$fр:);* l¦ lf(ФT\ԉ EˀRHw<01^)RߤSW LyT*T*.(2%s zݗgTFJ9|NQ$]vHvpu5tGW/<] YM@x +=9cp$Xm?JTaٚl ܣJX>_C& )ɒ N2z-k|bv_vV^1v^rnbG! py>)$g'{$%*JA岑:z)%(7k!xEL J a\RʎT3PbU'/cT84Zb0l5zy:":^Q橢+F/ /?~[9柫HS9mzTdw?P2D?}ǶC/Oqf53sJU߿ĭ{VoC{ʽP֏f`O[Cq)rjkǃ٭kz1hqw.p|987m ]tϧTOWИcL-1c(ڑhkVL*>V{| v@Bp?` ;C$Szu՗lc8cp4gF\kTo"E;ϲnp$+]0h08.ڕ>R Y qo> nԀS񇜂0fMM^#Gy!2l}HY'#34}k4 <+V57B|~HZ v۫K/. 3$a9qV;s[vGrcEBLcV4)}( U+O˙gR֑Ykmm7*t'f.ʛj9ӝfӋ͘AzEcFQ Ka8!(h7@+o VSo+'|p{ }:C|b6{٧ϱ`->6 )o8 J_09C@Фʺ5pL8GrjN {0So=[onl3iL#6BZ%{䝵G 5)L%maVn&ɸ&{<ټb[Suӎ3x-drWo2n `IC}}FuEw\7Ǒw<S[0:My^Px9-01ja Qݳv2 A_O)?ƞN^BwYAӈsb"\Yosc`8YFw~\hDG`#4}艅@GX ,4r?lAt{ }x㇟ntzaU<կ"wt ĝ"HP7]bE[Ĉw;I)q8uya)U3ygn(dlg0097AݙP.Dw af ϻ47b&ğMT_lFA Vx2ݶM W7pyеq\8U$E&mr[g_Y7/gnsM3 chk#Ls8FN so ,8= ~-B99f0q\8sF( q8N/kGdeW+@X˪mHmP"uQHbi G]?Dڄ;zN2da_!"x1UHb@SzD2wֻD e]YaJ ){<[R@A4CbCFMF w *1Fm:\Biפ<y@XV J%cHθOd9 *$Ot:yH'}D '/ׂ)gƴÙ ;|*{:S5:".(E&XJ$փf|xe4ʾr:I⥝'rE(#LvNC5,IxT#<8R$DE/FKik.(7UN]7,;A|W9Ia'ĬwuHJM\z ~uc+a@ʌ+89ց;QSSݯOIW_ݤbde7N9 HZi #2(Zc06=8U|7W _ft@K߉Qޱ淼5cZu= LH"]`LdI%]\z9k~zq>܌ǜI\p˺,JeN ZY^;wpﴐ$4%+%]Qo!IdJBvܒ4 .kBLP=7>R]Mm0"9."FYYE='geh]]K˱)X,7]ۓԵ!ZE+B"&;xX Q!C>Pu3 ? 峪*~fx~>}7iՠw}WFAGxkn敠ڞ`>tveqq+2 y= FtRb~lZش6G%oA|`Ŏ&;V4%g7Fw 6=x݄Ja 9>17c~ETpDZ=I,G`Z͢lhm$!$0xihz$xi xi{'xasȄ)7#F[GޱfQv #m}eB/Po7}+Y.;:j ^q.d3Y+hQޙUP뤳/ōy˧lQgi985[t3f\+]z$BEfN}wF u`K?||0=02Vb~s{ބwr-hj?[!悅׳Y!qjrY4]R sB5 d4]H&\q:ۜnl^ܘ|a:W>hpH!*Jz+B%%+]7˄)Q?8؎S/689"&tR"E dLcK A Cgwz:G^Co=H=Ug/&[>CndvvMf!H [p)ne z+O@8m̂!gϛōy' y:jȢxbFdxs &+e6S(O1sU+YBO.Ő,\ 8NvF3.:IZASNER8.WQf fFQƐIXI`u 3xFyڗ3=YnZ=./^Ʈ: TekW0I.=*7^$ܕT mr|\[*ndq_Vv'p?Kf/vX5|]&yb1di66Ф`\ݚ۵u<;X_>_.nZ|(5*EP_q8)P.6=jfh(*j{[Q@yKE :uUb@tԥr@Bub\F3Vh_|jrV֡`Rx4#j4^:qˮsJm4^ g56*ljZc7bhn01漝 ;\Z]\r}?`d}N%[J/ cf[y݅¿Jf$7 4U)kS7+ ]џFb8csM8pe993 vŲ9!y ?9S3k"~g\ج;n` Gg֏N!,^R̾ȑxmb^|vAVTL/rQY2}:2jz_/JkV|Yqޣ<6^:+}Zpbn/s h}j6z>͒ij=WIw=Ҋ#[Ijucoi*UGys hӮΛeg -2AaE塾mxCOElݭRՍ7s=Ug`5? 
ܦa%[qüOO/~y<,4X>KlʹпM~HcTggZ&?/|Lʉ\R:ag˛yff2o;+7컪ξ;~j2 Z$XPI9@Nc  #9BRB^$ӓ_x>X#\.vYrǤM>9F9W kʧݪ|dum*}oӫwՁad&Qr|~/ٖEˡo^dqdfb3?d+dr*b*L> L*޿;?AI2+mIօ,*K-,M\H5W NDS9 JZl׫C~ٖ@NGk[m c*)ռgmhݍekIOX{yVO/-TDxe 1_E {[AP AgM\g&ralٺs"W·A=b$zB3FnR۶w&Y]o#;ӧ*;2C$)J.l9PL+9pShb\~WN6 NTvc}X(ThYz˵ w @ŷP d'so VIRh6fVd.h{{H.\x$l:|&}Jl \Q=Hfi/t" m ^+<RJޅ{_[UwNwj\MLPsQLC05N !҂ F!yi~O83F 9Јݔ1bJIM|+6\|~ Ň~i´źgTl޵} ڒ=C/.yC =8݁@+oKvCWhܵ H鏫Uw֋6DL [m$+_{""ݯvӃ w г %QNgC$74DCL4ΆkĤSp_B6=j]5wa㿽f}xޱ#F0n}u6geLWPM9 Ly R-JE2pNr"#~c(Ne4ԉ* F\)JKd"2P CFfG.O-o-j#5E}B) +MD}b}y}XH Nzeg<}j>")τf#WPkaLI6SJ[4w?>L& AcscnM93Tzޚ6\mKtK)^Ab1xKV>L\8qmosdq~3O7>w«|ƃW0*Nj$ ,TPeLgQPL*QșC &Eb흑ΔBIE-6Z9F]`sYy"aC˥H*}.Zݙ͔vP& I@KkXD D Ffe.3EHx k BX &fHx%f81VxӔ,.~ V e֔KȈ?svvvh% @F%']:T5*K˶q-%ZQF$5-Jq{}b.ʻ4Qvr). 5 ]` h39 u7 1QQRT$L AeZ2@^d Y/~~#n֩TDZЎ5K#Q!ZtK[̟/iuqoLoaڒL[ Vtmii_<!_~+HC ߫+ěk}u #WF)ߨ\{~a'^!JV pG(^i#~JT؈Oť і}^bSIWhyFπGgug)8( w 6wCX;X͚;EApĕ-ʝafO]!OP,=(U%j$u넬RMĞjAV?SK@C"p _|ܠ2.3[kL//A;zZMQ.KI/ϊ/K+aT 23"h]@{+jFlYވoLLJǛ '`ɺ8;,on^M7nDEPBi*!|TrJM/ks]÷ x5PjVlojF5 BvCf}AL)x}|FOwBs's%8-)-rTBӂ"re$p LgZA!9qgYB!j)&BBRK4A0e7U@ SJ\+bLx%fKىβpi*E)HZ$ɭIB@‰23%KHe9pӕ6U8gBjr6)8sov4Y!\^54v ԚHwm͍rN*KpN*urLe&/r$8-{-y.I? Jl$%֌_7ݍF=iXJ92޹\&a3.2ИhLg!KMZRZ9rީԠS"8@nΦ ޡuΤ>z@AXы,F'鷳ٵ 󕰸hO۰ݣfn)t[_={, zHaZ~*NOnM/}_aB''e{FF}xs0- ݜ'*OcGWs78?syү7  -)lGsn١$CO6)09 @4>_run1j:nWFA^"??uԆ\"h) x0 j<5AF5g5hʈ1#`1Ŕ*#zx+%\r.jjm9 "ytH"Y=z&~$_'Ϗf"cZ ķȞ /kʫ7!VDCAc_f@]ф`$Dlo4\AЋN=%N%g&I!,fx K:}=)r"hT;QIu;!lș5YĚnD8ԣ jceBB.LjUdNjM`eZ_"Ҍo*lo۷Q +4:> 9ħWkWTiP >?vF漪@#DL5- @c'1H@tJXKy3jGVԈ-E`ޓ؉0* caJ1ؤ,'0jd4S;D 8A&,w'g+1AK\FNe-s jaÚ>v!BXƩ9"p m5 .(m(݄SlBD: 0F?E69DH/$9OsJ0+LDi@il#9&`Rmo>GzmYYNr ibD47Uc!*NYVjNqcÇq@-pT(Ś>v!x9ah*H] *hhjW®(ԆV1U\ H =ZӚ\7n!]q9/aEJ%A rݬ&cT t<ݯZ7XfTzCf>v8/:`۔h+ 1U1Кi@۾0FxPm['(@YCh_Xb!KQk?@e(V`_*T})1TlP/6h, /ۍ ֵۋr!-qڪ UgaUШc6 siA>VҚ-nfyZf7*9 T .YNEQH^Ua'r0~ys5X] l2p'3Q> rUr:|@}ҾDz͊C!XwY#mu f6ݟ6Y1)VE]Lِm :c9\dO[ccZć:W']Idi6/~L}^8s}{yrqw52uYeW]•{'6WSc{˨R*#)5>4"m [=2w~N xd+~7 TfqZS-(nSÛ nd]ct@iVG:sm9ё3XG܌wO9LfN߮jipz>NY + }wY<,huO:+huBz=LBpgBƤ`ֲLKg# &d$#fnLuԶLG lH"kV9} NgFum* T4},B?;]FϫVbswx+%y/ՏlnDb^Q4)ڂ^dex|U%`'gxXmXXbuȖe-dH5*KZOO%.w&cȒu;USk1hһ˕a"z0߽FR߀BRmBbpu*-^cxzG|W{ y~lnUJ`N O h1CG=QtWGMX`A{t?g?cR/: ;5b X\L0Vqny&4Z&hV㞧`:\2GVwѩmGv dIRŲljRfVqKEj9Z n gnu5[x 3?70,OӤaN4ͫmB0&Yfׄȭ€|;fݣfx6Ustw_5q, ȋ~*NOnMť yÄNO&goۇ7\Ogxws~N$U_#Ǝ3n:q<O?_o&?XEqXIs{R2 N̝ķDZQImj | V[S=)$ -֭K]2!k)jMҡK @ː9,𙂌[q-;eT+ J6 8-.UO|qSq=䡁H& eXNR44"u2>4y=h2ʭ!Hଏx2 Ǭ Z]Nei ؚCIA)[q"By/.4P1Y(^2yN*$”t`ydY'NT I#w$)2999:<,_q-Ɍ̳{a$*rЍ>F͸ toQs=^S%^ˣ~h]z,󑷬SM 7 wLy_0XmsIJ42c2'dNDd&i%+M띎fdH'd2m\eVԑ3q!ht!uj) =`kb z4|`T<> "{1-lVjD*oە&AI: FB $,4rDz{&sm'QMY ʻV\&'Pd)EײL;I="Nn)iIiJIa:%HJqCXq@EÃ4WzHf<ɻY0O" r3n? VV*kQ-,%:hnsGSRnY΅",fR%``J;=.aR rp:F0SI$F 74Tַ @o6-xg¯&o#510o;ڻYrs3 'k%,f4BeYn4>D͔Q@FacqF3# /!Yr`=84ƭzc`Um?hAV xW i?L@+]nMjdɮ1"R11$oRČђB#cZ1ɅB%@nIA4rcJDcQzc )aѧ=W ."d2 F*>!*)j*<8V4$  }%BIIf9 BBPzWR4-yV() \ț̓Fsj%I9hQ,UB"pXlBΨiQ:m_kmصJTA`+@UVќTid+Tp\m8J\!˹]bպBuHeV+ٿͮIy" VU`-h ]Zj  ֌>Y)\J$ [(X<_5\-!8؎1oEDMո[/\tAB6*{vyTS-1dSGgnd)7i>{CS[;orOnE&9ZCp'_&cF ywB:gG[¸Rw-6;`:~2GR;o{HB^)WInm1#:mhݶ9j@ܘvk﬩ݺW.mdJ^޵6n,"sh2<9 "L)0ds- IvՔldj5VbWU]}<ϟ#AS}z{/IMyN8 #K#ʙRaHg,mD.\,X-^z>c^\$) hh6_.3w̵L&NDnׯb"Z*D! m:dSSMQ!o'ٌl!a/* zv0#Ȓ[w{/o}^s r͎^Z2u䳏8p7=JBYE;m̩F^Qi>\# +$8ӺvuLڙֵoZ~hŅG%XNU.9jiZkanZkj ʋJUYI,2ѱ Em庢r8± ׋,Iy|z` ۇ[k<'MT'/eLܬn_\fWjdA W/:wlE zjB4 VY rU TiYTEҔ pc(&Ǽʕ-,9LCBa}z ! 
.%l;V7D 3J-t'0bV4DBؼK=>:-ars r4 kbF R_"ǩȌ1A'S Bd-@T JU!L[ "IUαN]@)B 8^knW7欍iWr2JGai2aXTSHPx.Y\Jer"qk-W d:_JW_߼8h#&II* cPVi2 L&+mcDEIh-\ mK jWV(ʑ椴0׮`;0 4kW;Nu U'qAӽaz]}̓o=Bb+C =QݠA%b8r6UjƘ!#=ЪBZ"-/3sFްֆNlXk^3F1T/pQdž АBh>~:- S[^RMviGp4E‡õ3'Rr\J@?oB*l_;k,?b}.&!HK}<}A(AjL6xz1dLBe1ˊL+)2JV#s/k-hu^j] 7F`0G^b%))c}*5-2o\}ݱ ?zfZ'c7b8b>SG];3 9_x % cU>ׯe~ӢC mrz,x۲\,Y؞w_>f%v[-+$ b>mw9n}ޓVuZ%||tvˆVZX=|~z= _1I8@&O]=twsƾM}i ]QOUϏ3hzRc֏8^Kv$M#SQY; <ƬYfGV8SNȰָ$8,-K-A(qT~b2(E$Jje#,c"!v^2K#B>3d:s_y7n7w1C y A]Ӈ$Ijiݘn A|5t}kl!ve,F!אgq 9U !13*Vp[,ǥ ] ,T.+1"/ ۊ;Z1RˁB{I`ܨK뽝01xpDp^wz,0঳?8`Ɍqɪ"BV P_.qLRtE6 hQ!$M!cRRN{RK*+_H$CX BvĶѦZj:\%U`]M4ʦ»@cnNlm,S>\%莆zMM J,uKP~)Dz KrjL+qQq,tY P(.+[Qpcmqiav2`almi{-cRR*,UH łHCD Ts"K$ .&(Qz+kIb^a,n%#C 5k^|z[󀏦Lg u?Ѐs4k*%^)Кۉ)TbK nf^~y9 MᝧnJ#E P08u/V}l?RB cD#ҽe#3PXeS_j5s{Icv7aMnMu>u1 >zgːW3ObʐkQD`v)نnj8% #d*@75jB%K(}%)3*q|+|9CѠ(N5J`c@X?0,=jkJ^=Z qƊVRD!#n+N$Qp92 <#y riZLRĵؽ>-0+ |ZeɜsY ZT @"ܧ$=ĜC;̳%&nWpFk4a ɸF<#96IGj66RX7"]ۀkܗލ{L (1V ۷hG=\mshm y&eSa&<瓎NYԻ9 9!z$Hń%1L "Y:t)8o [}=qW,sE133e}0zrmuYY87[\m3vn<*1^wڿ~ n,/;;[E=("egVwM+Ka/#ŠVpVʀQcGw[ty@R=+42G4Fb;A=~<+#՟$!jKDw3lFF֑#iT/[[l6T_&ܽڟ?ܚǜ _~#2Ξڱu?G)FXILX2bfnLE&Z-$QFO7V-"1&oV-|:bY*j2 ^7cL 7/JPQ‡Ӌw2C_g7#{G %9 p YS %4q*ZzWG ;FxH1>14ҌY ZAXV{_M7 K,zaY_VpgoriO7I~D}m \o݃hk4 Ġ+=DQ!vPA4Pt8H%~!\ZATdp4L{ Ynwt=NCǗ304TB Űw?[bPN諺]肀nhc̐-FÈh&DXLK x^9^L1 GGѮ0mG]F5CѶ ||p=6O NCTsǤ$ ڶ*RK8TZJVrV}rD,pSƣ1wk:Sf(Hg]aJJ\+,_!o[ULhrD"z%#ersIFT TYG*y%Jn-uiD]UQT$W,ms"!Q#``GVYd$Pe)(TЂeЬ AQvjISZ-+>IKU0LJ!i2l{+@T*KF=h: 0*Jk#0#I9qk byYIJZH@0%aEAdj~ 5(S=gO3,sI;l J9wό:Wc0v^g`2l~'f<=)0?<~=6_x,W|q4ywe ""C8?}Trq0j)&sklCsLJ wm%=)'w XQ`W*n~~{^^8ݻݹ{])“&daQ xwGY|1uM3o.4 zە)ݯ>M@GPx'LŘeSrE`HX_AH/BZhIhr0ith;-0y;~h6NO98 DZ }t65A->c MВ X³+M{&0V=ş+Pj178 _b0{Y䶻Df.R扊0xzܑ~.Wv1XhqɩWvLanRq?3[9.tW_"[ ?ۮUbkFb}V(TRgUCңj%>?浤.ŏ֔hu3mR#@# ;|1?xA!nܭG?whR/g`'F܂qT0{?Ϗkvk#7}9'RUV8W!K*ĸTĕVyYT+sA0u_?kߡ?a9޷6n)"ˆb~ucwiet XrIXQ*!kҐhX^[[7^+TIZALj4eטrg ޗ;;Bl1U2͗w-CnhcsD&fv18pcvM*1&õ[h[8-/uZz^Uz갅I7u{b &$:wkgKC3.$ȅPT.@hIrJI1\VF*%ZJ!6..*Rz/Ljc.*dR[jSn[*d-M\Uh¦h/nn믛]o]E+VRy[.[HB[wJ;?ɝoBn<}Y9&kz#1Qc$kw8%65]t_kw/6EX BƶzѬb VR9~XR9qlSJ0+skG֔84d,^V .{)/fW#6×\ -"#b*q n΢iĨ手p'-D|َr}ܕrfp=jY*aXok"峌dĶ]`|Z:WUnf4-3 ff-(J,$x0y!/mxOc°p fz9_j)살1BB/RNr]r9Pw׈"9Ӛr,1cGOVTp6&g'WwP'jumrcJi :f߁- )c"& ]n$9rlUC ry€X~ ޲[u.s3MVT]TdfVVKX̢bFNXFJH- 5F`kD2,(+ThtXI_*a(0%`˜Vym:ƣ@(0;daoqS4ƨD[]PK~wTk >iG{9bfǃt72׿!=|@zkO,`uRs¿"exǜg>5?EZ_9!% 3F\;:|:pR)`'w*FSq"v ~: N-м5~JݍCBq;r4WBPlLF[ MiIcQdGL\r u7T]J1CIv9Hkur+P'ڴbcg*\TiPbl*xo@(e -xqk4z+TBU=}:!&?;T 8zkxBGHaA*iܑF ?k3ҴXМuCnPE鍱A95d $&ET2a2FAMz +P[R+@4z*v=I$InCY$?F3N#Is-% a 1C,|qϘCA *뾕)txL~6ِ}#23H79_Flx㸱8l+6FK5pڃ[(Js8Z*6dKgߖVR]:^(V&?+[WUʸJYTv:Bj\S**zi2^bTr"֮Qfi̢܋v:.H,c uxPepN$@~I<7aR,9ByeD^v(UwW I)3Ě}'h B_G/NGfTw/ h\hR %NUb,~if<ůBIfZa3rA%*4ُ,쁖RLQG[̶tm2eۖ=cB tʑ E16R ]])@}QUڗ6Z)3K|H ʄ*J-9)40׬5YfqƬRiRDC#&'h &9#iYg4zPKjNs8AE|A`ppy <t`< vK!L@FQ- jmDjdƬBel"^=V'{S:&yv,`/>&k5W{<^P6rH+yMԋ49.7ɻCnro(V%Fg9$4 OH#]:Etp:5>/Nr 4=x~|ec k%W!im/6?/Vo~qr|T>Tǿ::::l&UfQ r~E}ZArƺUp5.n6?aJ Si~^ʟ@ ^^EPc&WgoȈBc2[U!u6Y?/?*1Łw<:7*{|7tA,0_?Iu-s{TעK3b a 5sf|I'Krp@*bԤW ˙6zi)x"8cvU2JobJ Ɲ'hgBdźJj!gVw\EAL,/.bhH2/je~[6-٠9Cm+*s :97ú=1Gbw*qcG B`vN~[ !М7ɜIf$ 6.8m\U>Z;Ӓ2IFNݤcէ{7f^Ϲ(uLS(iruOWX~PwbHh67>Y_ŤB+}82rekc'DgU(Hk$qIl"'-}dŽTE)86y 3.O `x2KMH&)֑BsPvkb/Q r%D2Q(5!}or3~Isr%|rChӕ3}+.g~zrcrf]4 f6׮óoSHR8'^TG'&Vd$~>-ȈH^)NiGDu''=O7qO>zPZl xU}??sݽ$Mq7ͪ|?ON;D3"VgTJ~Xn EIwAę(}d!odd!DcNJNzm@ISj uC$s;MYKMx5SsjԪ}We1da+vw~O-{*9%)P i4UA{ʉ?sݫO@0~r 9ԨAð0኷uI8 1ϥP'l |lmXǩm5:p:h ZR^ݤI:m_I^.H>[g먵BRh_s##Eh H3"ՂIC $F7\y!i bJ0n!h@{pΊ-Jt=_) aZ: }V?T~e_"4эPNbl6\LtO.SEՔaeWjAa<'sxBOe.1wVW6u@dH.m ^B WN#!^|w_=Q\xpqySwWC5V 
ǿzS?Ծc-yos]vbSpL%9y_`vw]Z;=|Ӂvw9zC.W0#Zf5nF0,X5@<6o;c2V!˔CR7.(i@ K$juY|EX(,. T˥*r# 1W6dX paLHdd!Zr$Mdr,N_<7̮ k#uƚP#;%Qƃ i^7"Z(0J)EyQ+mSů_Յhe 3z.J)Ud)pD sDL800CDp-ڛvP.Z3OJnAfN+)Zw3{7f_ ɍ޴1aq7sJ!J+=%40̳DBGb)y3R(jec̒i G W璫ndjWnwR 1KӗDNִ58Y2 ︆UsMN(+^'_&Gc#[H F-Lr*ZʴIJ' fZ51͢I>iMiw1hCP[k!ͺ C^ikHaJ^W#@%24PMeW ۝&Rb h"ٺP+{|/l7l%ˆ$7B aHASo>CtTXHuW"Upi#յ5RQ#Un&l=}PF)rFTh{g^}Tڰ}:h MGOխ^7WUjE44*&HJ4n \$М&p*kƙ1L1h,gO ?{Wȍ/=y^)qýL\l`ɞݹElQRKfd5R`c]T,Api T FX$S ǹV"n/oJ i&* ] IcƋ-7  q* DPL~3דD™'K$ LEn%5L@Թi&гu2OIb%n,?]v"S|9 [X2qlQ3e1ȘGV\Q+/߹.J+mw?Vu>QQΎ2qCDr^xdzaO$>r-z" Da@5}k|' N*\jW`R tiOTn]H:x1|Iݐ<NiyR{ (} +D21xA0k%P&wMڛuhraUi*ɕ2_fAbA42p ogĶI}PRԇmnՁةy'8Ȗ j:*Y^aq zaK//%&&oT5X j0Jm* C?kd6J@#-QK|o Rڋ ⋿ y}dzBg_ߴx7qE^MxZ]Y`SBs7o*rÂ3>";ߩt\tS0J\"Lbʐe l%$XJߑpĘ0+[c},e76]d[/:$>TJΙx+(1N?RT I;1kp=5Fjk8㗛D!va"b(f1Jb䥟yO-}ɧ'zx)LC8OYѫƭi=/Ęw=~e0tqr_rDå&X"JC[q܊"\bܠdΣ#^+bmJ+K;IԈ= țz86zJJN9{H,EvfI=i"gy߆8^ncN]=#8OH%p$a'>3>g;\>3SH# 8 ,AZXEW@F).ee7D0ҩ> ^p E\݇ rߘqWBPS逩5PM a+ ;գc2> [O$Y Xtb;%-t7wk47ll=<ƎcndJz֍"IZo5\Ln߽KLˏW@Q AvCdޮ: n3s0n;b=kPkIGqNnS[ FS&LieM&&^}bws{|ٷU )bG?I_ 2A Iá HXx̛ܴݠrr~+";oqãHyb2d ""dDz9:) ˇ %X>g_fY!`Egۊ:Bx:w*h@%1Q#*xcwRS'R@Ai0eJ" h.T,CHԸ۟*u5q)'ur IԊ$X 2Z OWznH6j]&K|p^ӓ<9`\;qrTyݲ6{7ϟǧӛwAP ԵZ(D0]ZiÈ B f K-(%\KDyyf;ASrQ;7f^N&.&fμ͵}~>vjEbʾ G5%j߿O~"s 70> &uëd׵hA;ЯvijN| J/m9Mz3)5wd!'n)6X>P1q4~;얊A>viPf[zvKoT9尐7 CdR{k‘c*;VNnXefQ^bn8g{oeDzD([j 8y3!-+%Qjc1\E{TAF!`+֧w{~ERz8 hb]z(L!hHGn,=E_H`0EA8b0 ,BWlO %QÈ?R L%D~RU ئ&ZX^"JԓYt(~]K_r\d{1|/,_>oQF\Ra*Hʓ{v\@Up2M߭ ~nG7.rJd>3awu($)!IӍ&x ~!50>& T>KqNք!=ل!&ٔߢnnAbc:2.uPݒ_4vva!'n{!D@]VNpך$l+Rci3f4cM@+Sƛ~IZAU=[K7rlZ5DVR&0_M"1tmˣޕ9LrH[Ғ6&=6 |YK>`=g[gd%7ǭi׋Ϗ ooV>]Pcooݛ9 kԊHza!eS 59; szju ZB7+Ck xh<ߒCf-TYkM)%[l) :xojcew^Bye]7hiB`y,RPvÄUc U:߂qdk Z`,40Y!r_BnԊuB'0ǀ+^{>z.ʜ `M9FqYTLBfjjB A)h1*^C_f:] gm eA( Ty7"s{ܓo?~wY 2o>AvIVgg$0MydP X0ɐޭ2Q p1+iXՆcԜn6PމU*}Dz]9,(49R )q|?7YQzo@U!Dm1# LSi,xEvyˈSV-sft 9>BPT |ʞ tZYtbPyWxP呲K&Ѡ/1V(fi8RdkRj&Mm11%TKyJ-hkTs ӕ1iJ*&j-lF BFKMdEn%0 {P"VV9%T1\[-Õp-5& QJD1aU AjLtY$37ԕe%1T618J@ V8Ç: -WJ%|~8XȷrX&գyԲhOd&׏/_,?n>͝uO{t "ü?#|w1jn&ӧq|{pp>onEؿyX &7)Gw S(#b31/(S" JJaeNJ %Žյw{uW+,i-P-72s1 Ö֍b݌dx%>|>D<".TknuҠ'аm޸i޹!9AR3p"`?Ԋ`Ѩ|q×j{ aBތRv@6/@K ii2%(ަ&n)Øm4 >T6gJ"A4`? / ol6++f}-ԒZjec.QUX,JH8Vw)lP{T,8uw?Oo;6f2ǛeuE`uRfF0^3PxI_EqM-CڏlC H`vvjU t43Ɍjg XN]-kq0}(oHK0, (d-l2sXz{r _h gr>Ga ޿ZVnr9o[K 0#aBIS*[RzSסFA?OnSU{gt}򐑝ݎA5٩"sT]I>{P%'b4:h4¬X䬦UdE$G{Ձ *3ȎuJ$4=_Y3\q"B\ X&:Ǟs02E$$iaܰrp2SawLYYv|'sth];Tࣾ{4BsNw\JP©Wrg0֙pw K FR2b:yRD^5UK0taKY072re`XCrlLKbgJZ;k=y;);ֲ7Ocx/,17Z.:^dM%w C 2 |$gpGHC-#\!]n6!j+g5b!CNV?.:9~o\k-TJvï Iz AzzKRӾBr&gUeA@zPi͙--b܋@5>|xHˏ<>ǥ1W b|'Bt/\/.r2RtguFA]!NkU CT.3ǏFAЈ6>6 =2:-7";e<}~z+=wvbYƽD]/eC!/y-[8eRj@+_ks":uqu\&:;GҮ9`볗/βbbrhB8hϫ+ }O AWٟQuq\oFj  ߑ|5޹-|+wyf΁Чяf*]L <ύ2o'[ףĘ @ȗ@M'eW.$loPP9x:Ad?TgӕF z!836s lz6 Lϥk5_a!)R`j $))7L[686 L, 1sJP ZPR]jYɩ2I .mOO0¸$ >%%5oX+-99ա!l/U-0xO#xA7uEXCeoJqF+6e~є'Ma2keH7~Aꃹ0g}ѿOw%-fKbHk3j,L*ObRkZ_J͔zK3+XɕUf+$*=f%Y. n3>,bHGE{$Ɣ O E|KC-yxe<~:ɿnfp-x8V'0 bdB-ORI8n&\hx|7=B@np׺GkaC9X)!ӖbfƟFAgb1Qb FĿyK&O+~]WqJCA4_tW,7̨`gZ'0*ѡZYwK–G巿=+(cȐs"Rm> %p 4jbcxHCA[4K{)ZC\6WExWj2*[ kΆ4u-NXoy.0 ɻnųPbE49]1b@ 5춺DA`z^E]#[kĩ=`HHhg\롾uj>x1C]V0HVT0o;}8Hd,-R,#Bgz,nj|l(L=gXfDՃo 4#7*?er̹2` F4|`dj1HPBULf@ gQ9(3&g= # N[ gu>d` QT!jVL{k=5,_ĹǦQU>ks _/ܤnR\;h+ܤGբuUpXzI<0B &}p,$>$_{YXqi,oR {saۡoqͨ:AhS+fJ=Tz[ߞNh%Zp)8'ӡӡ(u!0E% kU,,fU0 yAiXހ71,T#vD^jHJRϳFz.RVB|kZJfOCnQR߱ιw~V &l> Aㄋ*4$ӿ_ "ja#Qҳૺ+g kGBrM)Z~& FCnĈN;hy#nHֆ|"$SbPjDemRHPa/keMhs&Hٞ:.o׿{ohEň}:7ݻ^@.y%ͽy0 C.|{6+a[桿 מ\)7z0gl6}_ *3J7Vf/ND VLʓϋP""~_P8R嬂kv#PJrdt ۸|M؜{L P1)f⩕P=!?k&YCcmWߪx"ZDb3Nb@?TdسvY`1A3WӦ1/6<ܹBo du4}y;]X 'C lڋy 4e}*ǒ9ԾZzX#[G-k+6}\Ԫf%@ٶXe%FܝOY$ G.P'#w`?n ^3^: ,oc,џWAje߇o! 
/迚_Ǚ}ܞd Mٛ 7Afo2|N$܇7dtwN b,\3a\hB;=+(iG?|,Nף!5l2!~;&q_|rh$`6(B s&wiFT83 FXsęBF3}qeU,eb TY"Q%b*UwX+T d<4dY&9fKE,D{3`,tbBY ZqHƌJ{lngj{#_iS]}\p]l}Z (J{="m~MIZtƀ[M"UzadguOO踿Y<ػ+ϖ_3.9w7z(d~h# O>έ$^' ߢ]kP\H ' k`.`U1Q6-ږ\ܞ/3mnɺ]\/~,d.W7o߼l.8)c-7lf? :?>p2傌zZzI.v y& `Kz * o'X D*T޸-WA s{€L㤑А{tPJR>Hv$-yTbҽUҙf\ZŬݷ:1qnE [oFB|%9Enk{ w+&9Nf ST B\BMyYd2jg`<\zxfm؃lfxtGgn2R5lQ-Ñus:ClB$XةLK3[Z)a]Zic-;*4 坧$p9wq9AA쎉R}z񊀾N@V &QiY#<ݛk1Ԫ>/\a}mĮv V:A*J =A/WN݉vC4r\C4rñ9`d(0+Q\ǜza;O:ݍyʻkBba18 Hyʍz0XHG=u1UO%& 92uP犜njT<~kb xʤڌW.BI\0xJaK=:Դ>Jy %ȠJ%i$.d“sK-2'uT,Ge\yB^YMMf5w$U)mݩIRgrw)F'$"Mi*eJG>lѕ8e~f-iFNz&:Lf.cԀL=At#i֢Ҧ;TY;5F7{"lb2 XEcv1qi`КF@M6PrNG*Yߙ+,Er0鸁:2ŭٳl#ɵsiEsh~搱*r2\g .7T̚WZ-7pxhURU:+ xi B<ˬeɥJZԵ H $>!1AH| z/_똡5%2GpH[r$c\zb]i [t ^}_ԫoߗwP'M19 q܎"q|eRڦZ }eaIs)V]65gVфAvl ]Z 8L5[/͹AdQg{Gb2xF>Euw+Cx@DYH ^tY9KKTиRTVV[m[!0XߓbmMJ a !F6JZHdD3V*[S9 i$=Gp1(7$3M$"Q |h;2C)#ɔsւBþ{"tߞkݝP:>B G@*7 Śs}C}NNLqK_DY0z"e R!j[reP{~F<3rV=8ȾҼg;[KQ^|mvcCPX4JƔ/Psͮ Jن=QYO\W~R=[  `5EKzhoAnh-pܚHs-iQD2,-Rss`B7x6"$S@USXs" d*f#E% <ZNvOD+#%䰴6HP)uZa-*x?>VpY꠲ o>;J>E//;s}Iћ+7ˉUl뷳})Ro蔆!c/߃|4CdZ~|y~I\"ϖseGSn<]ɀk8)Iuy~ۺ`vVw:UlyS" >ռ 8Pf C SJׇMz~J-x"|"uWFÿDiIc'ŽgȟTU!!j 2㯖kz JY &SF #3I 0*r9`0O~"ϖug.?-nl# sI07%9LKf=gIS/6ӌ}~~Gv{OZRw MNJ1XC.-h@np h>gEZ/'k" xe%)g%*S*WY*ƕmY6dt1xВ=R{kJ!| ,Su< պ(Y-@^96ɏRZ2S9!*FV8d+UF-mue)C9q/ 2鐵)>wwYp4ȂۿnTa`#xzI,}PwݭާPڻgÕ_󰂛!nV'"¨+mR 5+8 6jZ$??-X^ ͫe5,S2WؐNTڱk,Z[{w.$Pi99љ#=flal{V<͠Uط(MqwTZZʡZRN2bfkJeEvqy8-z+,vW%gzt8jYeylPimկjrq+ yUF% ʜ\k PTEi/8"Ԥ:\8j_/ʻ+̼uJjR$HY%2LCp{\ZqԦR逖-6<$[pIh*d>=},3EFqJSg̡Ke9_UfU^e4$mOi*JkIR#}ͭ.5;O(=+ B:L~68畐_mz'=ڙª@}Xm :8 z5 Z qN .5ȝu "*C;E2vƼpY: L_~U Eq}gkU]nFS`BNf -^ *R`1ZܐIL:bW)%$z|檁\v{0Zo4BsdrZoe5"U",oR2:2#d^ʡ$xE0B ǴB{ŠLyIL($d,ͥ٥|0BȈ{9@J0N3'-[:63b6V(AL'x8c۟_W,w@KE ]SFTrᲆw"ur0 &Y(y'CCf̼ 䟮EBQ gy̔+es/2 Xrj7c+k yEb=(=ԛ}ۨGQ~H? + Ojsu|-jlvqEтE0q[Eu=(U3_1=V ;:iO{+8ӳ:$۷Nv:wrmbF8kizJ$xw?Ï߿Vgg7-V;.*N./Wy0_H|\//}mH> ],`pF@p?l* uy" PX#yZ _lĊ10+L\u,Jm,w߻YkHrSOX/k$lLB ҈X WM3>ywiM4ܓ`ѩ;{\9=!3'iR>H~V0n ^cb_ 靖j5 %HC3ƓSi5h蝤wag43@~-UZRw"7*]Dt<ϠڍqcNwOq53?=&Ya3S.QKr@V hjP@3F3j ˵ `չ)B. ~MԲG) IspG)1%CStC'pDk'ra%w ?Ou τ['>0ra":7#cӽ8jq;Pk*X. 
+\ɐHŒ oG~jA^gF1Il5!!tF30pB>/r=n^5:ގP~QRsnN=m5^jE7Dkr+6P`-('i#DC H"jaGxPڡ(c`#ֻ̩AQRLWFKPW#%QQ!1B #x܃IZ.K{ŴDxlAqHj/AFB*B|ƜpzjJ[3Id%ydjOkJS#E0N6Ň~JH9N^&e+V//eQBxRA1D rS&8.5z5GUPG(FV8IU>0%oxqq Vej3;]?(!j-+)Osm Dg_Fͮv}u{7+7ceԿKu#UuΐDڽɨ*29V <%k]gZ52/G"WqxS*gl}l,$r77:kWIdaLMi 6JcPϭ8ٷB 4w؇`rZJ,3SE< LHBJ݋k&ƨD2 4F9&TdL&H$@sӽD(J6ylf>,0AfdA⽻ϝAtioRrxL;<43Ƹ9N_oלuب6 (OOE }7I <IB$k1./cR&wW3I&+ډ'\}eD>mqAc J9s9Kbp@|_#AYF9 H8`(1 !2oU,0CufLDʴD~.Lqm Ex'5!2kncދa)%z5KRZZ!.R`V`3vhEIA(:XJ9:"M!ZY+Z0,PqE22l(5*ѾKQ}crF73`M\ MH2.H.Z\?,o:?FHnw$Bgs٧y>R!Br%G!5:rObTUC*3e1~u+7ڭhd!;fI\b|x:Y[MF_-m>9u$BO^k Ђߨ}8e}FoWAuAS: $9+*?˓=YF(YUƚ`T0QcN`ւjVkqc3d>޼q b*Vc1C+p M'n gDsghڦcN ʆiO*([Y|7 ۋjiiǓ*ވ6ahJ >7Ah@(:n'"Q,p<D#uIC BT18KZY4ZxGLv@ێmvvF*)'Gs.?sRvq~iZjU2K%N<)ݾtojhȊ^px;iM StYJ?|YȲ|%*fR$f~N6^;#4B]^<~lxP "=6N) KNr|jmP`x۶jβFE:Əπ2ev@f~,fj T`6,R "],R_dX&Iz}0>}w`vfl xD_?F C 7rPQk~aT@(]w>#tx<w/l^V^`~vO/` /OI ϡv] _l]R~w+tfC7 `bJx/6Gdn&c7χI_G@ÔPH|iH؍YHlu)by$o&d{)Ξ{%"Mf\d:fZ>HyG8WتEsC̵-oӲ *:ZpMPZd'/O r.Kh( z~U&;YYUyڭկ7[L^VNٺ5 oU-|57o45^-w1>9 .1d7Mvώל7M>·']]0n>%InoۛvחɌ#J|Hjs%46g@CDPs&ӈ_M[ɛ#&=L"iy8dZr*NVԹ}h+A>P= lWLC{Y 9(51W##*g:_26͜:9E,B@1 ǂ#IT^b8biJ0V?Iu@ ҆)L6+Sm4H)M,,: L9pd152%3Ă.i ژZ-k8\<ޡ ;%$!2U1iPsN곯[Wüh/"eH\Jsm`)Mt{#QA!FICrmV?8Z%*xlQc}10?s;WB:<ņdl;WBC눓ZIb, ]{SFb`[uvM9/\Wp29J޽%h-j%2 IwXAkxX/y_ Rt]Q:KqZRNJUq7C@걹HC'cbe x3T,i-Z7j(178AŅi I!4ȈV(BE%-?J PC@[1H"<1(~2p (u(/Foy7F@=$w:ɗWgJS=Qv8sOT?Z#R7ɀ$r*/SQOHIKw!Hh#LbdJHLAðmv];Ls{:SZlCdP7ߎ{LۍبV[\d``a+:+(KIoDse<نRej$-$mĀd⢓og2z2Y)#eZ}}a>*Q Sy]nѩ7Kw.M:84gH O^9kᐮpHW8+uƒ}A0=bOOG ]HGzsH?h#vv1;L7G!#{ps=8-{8A ]pgˀ D\O亵Zt| ny虯mRvN(g2!LϸJQ?}5Tmj>8r]LJA u|< sbN0#e?QˠSZ-NaJ[ syvGz$b$ ܢ% DQ QY7Ҿjmb_* ;h DЉIs͑`&vϢ&(D sS:&SmXi SҐTS<Y`؊IH߷U]48 N9+\44rĉ\Z.1F0)96EEd/ܚAn<"8i崔XrHk>E1;VրAIo/1uDuMgD {0+="JJu\^]^O|k?UvW9>zltSWrrIEMN.lΫ]IZ4<"-,gk1Z]Q,)F<eIUv Z cӐTJ;ܛ+r!^~؂32*E+/gQ(oR/X,CZxsGq'&rNBᬱMVup(MO(Nd缞_NjYeֿymmhiaioRw(=|u֢76?;%nwRǍ:!j3ԦRٹ{h|՚}f.bdA7$-! Z V;}4 ʅTߦ°[U6ߵ ȥ*ݰTev3km$4! O`Hʂ6!R˞;~;97Mf+ ׅu#Q%VnHT}o߼a<:LUBD®phV o,U#&dJySnL14v󠠚kcX iFs&PŢw Д\2:_|rS5ש7VNN6kpGoN:#`\"9ӆs(1~v;RoYPGO"bIEdM cjJ(v2hgώkZln#(h)e`pC 3H/`qp2Av#~&P۳H5-!9.Yɩ^Mn."C u-aop ,OB 歏c̵Cڎ",ڱu'-5>#M FCo4Tlsys\#ܥ7vW{;w.={;EpV)բI̩& ٜw5^{cFHx"m*g^'k|.U[kFk2@.@Z^gϛ =\1E3Sqrb=RG3a\n^q-$/vT9;۳2`uBl 2`5ˀzbһIssTˋ? "RTOra5=Ed`c^O&Dy#Lᗴ QVM΀ RxଷH{RŞY*y'$S4##ԧ ͐(l22Uc{b`!%=srtp#C,?<|Y $6`ͻJIQ-Ef؛۾IWPR-Zee׶uך"{/RL_oU?WU y4;,NؕS2{/HN«33g~|:N[ U|Y7],x?aJ~TJ=Vv9m[YTMGtCSǖ[ORI<mzUܸ~ %zHZt) ,v_rZq ͏en^;>^9:=\M'dS$LdZgN2C~~ _ WN|:BD ,R:)>ښ::$ٺzH_\PAq]dž7QFLBJj%;/)J #Rہb ܼVH+WvO]6-WA5SsEطI[P2y1x^ y1x_<;A5l,QyK5:y9yl^,Ųyl^bsemMι+6\!&\g8g{NejB˷ 2L憩 {Ԡ%cr6tbBI21)~GNqޅ͖i'd*)$IhM'Vt҅uU7 bB* CxvI5ɁrYCI㰹ُ=d8FX߁G oRAd! 
!fQto4~<~` ݐӈ3P25@(&=q?->,Cf;E+NPv7{\ A4 Mxe!YNߢe]GK]4Oc8n7BI?||aR1~e4.s(i7ZQF~ ;,_8Jij&7?ԛg3 oQv LjG1~<6](oԿ/і Ut~Z>?;W"Kc7Bs_DK@-|^?ߣv3+L( "-}GOB`^ilƂdnm0-QW񬂎"TfKfl=`ߔC>7R+ [C$1n'LGRyA8;(LUd2*W&-R`3.wǑGCv8UH-.5>J ,iWGKSj)," .g6^UGaU`̑`T^!^ζC޿ltEUZ F:*ޒwKAzɣ=ݣ' Zj}%g`t鈻POU/F.IOP{S֕x6;M?]?'`o7^0BAQO, ONt"E}f1DDUkkđ%£]Cho_0M.;-QԬVmMqbv$x}6eZ?wx<)<~՘ރOmߗEQ=~f\9*c`lT.?zQC5 Xs[ D5RDOyX5XH5Jm0_b F097i )|/׺vΩ/agrH(5&Cҙ}žnB|Wg3c6ӄ: UyYC뮹HB^JʥS6G4qpx=~pm#t\n6t+\xn s ϟW x;ϕwퟚsPٺ4mjj5tBz6iIdIW|6QUZ$Hs.lQ#+>t!PZTfjӽj.`GFs\պlM/T<~fy^[ܣ}1BO3kh11˽R6ϽNW*ĆK; na-1!5U ]FѭY~+s`)`LM۳q Qz6d`4 )O읻_\(BK"$2g fS0ݖ ;?ԗ|FQZ4)'I5_cC5Pgxm^)$wEg<n9/_ԓzG7AGSj[Ocn֞0gR>2K)4G!MUrwRZX0kV>$ !*xT4O4C56(,26(tf=a1kIhGmY-W^";v1ǥ!G*M F]`t.mDp,kL;'9Ië7q wXvz'zlD.4u,UaShíw[E7bmhSlZڞ-Ք7T')!3cyA*T~A2iůY2HU1gXc 35 iB#Wi8Tj̈r>պ6 ){0 ᄍ{!PXf*j䕠;z%hZvF>2/B>.R(xbhFr˸B?C7cpLL;BJF/9gb<}o"R4Prs={t'}c1ZDc %f3V-3QR1-J!W+Gne )2: B.QXG%N鍝v!eO]àqn;Gh-pY;,'<))-n۪d#%.UÖeMz5fOe>D5B~ %|}֘'!"*ɺ&0ڞJj.IM `W;ϼ4Leظ`rbt9 `4 "1sxjG1#iz:l}V,lZ]_ ?NiZeΛOkk uK75dH6mYs N!p 19M 9LgMȹTw8ℶ*Wl8G N6w>M yt^ w rC!ǣ\(Ħ~)jȣtpޝwlc,Z-ƍ;\p8`L{)jȣԼ;:Bؖ+3b{T)jȡ'TX>OomBw<;ozmr( `߀N`[ш tΧ!c-B}15Ŵ4XtM0SjP4͝OSC;Y6^ؗh&0"?/!=鿹zWFn: kN؝t `c,xډH¸Qo9L}d2uU"D`!u} a7p1!dc ,d5 +[ -o^EIg26 ~2&Y .ϑwъ 7< qNR EZ$,c$p<۾$zl[:/֎<$$zBa_D#.}~v^?EoCuFn&*Dq..?7?[,WGffw&SՔEr/~ }љS9UVqX)ܧ%3j*(ml ,TtȽ:Z?D"y{"AZ?7?>x0\v <A8;(Lqdו^}PH_aW#=$pj "z¤(+軲O ]{@_W;Mcl#\ΌT|`̓`T^e&ŋKa`..Yvjq wjфWeI"VW[G cZ{۸_1.lxzl6=Ŧxu(ײI}8cKcŲeZKÇKҟ4+_$}cgۼZb.޵BhGw1r3\}n>r9O/hi~q4fq.^+9_6]lf,!FZ3N4YL]k[űRosv~qڕU ;fWs@.¹*k2k֎%d%UKZ,Az|U[zyW_[7oͯO\Ʌ;6>|67MFer/=>dUԗ`쀽4;gnGr2[ v`jLFY&vv}zj쁝=J UDZ>f68ҲZ Tj4 dfMae ^gmqv7]=:?xr|ja⸙%/H+j[|q| /nd+X_|+p?~~U-݇??VIssε3~`onz?Fuc,yϚ+<'y]y%|o+{`-m-?jGu׳4}Srփ ,rY1S=b5ǘ@le{>%CpRS~`LgOpdK~`,Cf[E}wȑ_e a9?>}eGa]x/~7]z51>pSv`Fy:BqlCٓ7:|`^"("ecUdY:{&Zwޯk > "CJJĺ,oEpxeWrwO7wGGtjsj"A{O)Nis0¥OZ:x#:ClPj/AF>-1x|rXNbu~x8 `&W95cnwA̙v*Z`3%B,MK)沁'{ 2#y+\j_%Px Ep&?nrί)? vWl̨$aM iH< VHY -,DcA`).cRט[+Zҳ^tN +N+ cA P;Z Պcv,H"\ L$7J ʙDs(BUb4l9Dxt1 }BI.kKI'Bh?tb6Ie tE o+“21`Z3s!҂'_*F]R=5R4lmLöJ44ƎRB]fND 4U!T|ɢP'TQ@s4aFa&XP}! !+lIB#7$Gf $P ؙ-]uv&XǙJZKMN:M&GSnH )Ԯ= X,*[d @)iU0|^gqYaT"- g lAfY/:De©@Bwf˒p.hfc.ُ;ީI_Rb p8YD-ce} e4ݙ9Gh(S6GC\Hۚg\9' .#I]=֤(+tF(1Di2fL?5W"ZQ,Pp=$[IdZgQZHws8k-,f֚3YGvMU|?YL ɘTѐgif4d%إAg` A\(ia䴱c IWVK: 2ȸ:m=ZcFr3`YC'dRXnO /자74axrrgM]-~yTˑ}ߥxSDˈ,zO ٸU.H,=_AIK;u PY2'+$oK*v}IAYRZ(c*Elv0' * u Lg*d]c)'/Gjtr.'>!/# a>c8Q1%*&I>-`?!jJ[*Hxa9SRoiZ}B?DfEfXSIJE)`,BcPqCl$,{#IUYMEݧe Έ}% UR( $hبiW7|v9ajYzxj555.+tZ l6630"ZI3BN U #[KC'F 1H4HȲDa&m>Rai=SmBHà93(gOYz2:tǷA%s kJ6@ .| #X 4 ŗdL. ͔rrsߝד )DWsh0Sʇkx\ڔ1bjƌI dC C"DxEcW>ut8FTAh6!٫*h넆!xH^ޛ.[^K; N9"bMy+;?7c>9уUA-)kCDK7cGt/募ϧWOwkƘWuؿ'|XVߙ~}"wVc\k /$|j% [Ezjn :GK>9O31;ϋj Ns0.xT]I+H8cҼdY6nNRz+uʜtSaІ 1BB &CiefM ^椩(s$/1I.)&OQ"krbp6بK+4)^m] I/v> ţ5t?%Eŧrvn]6V%eb^,u.JTL .m%YdㆉOGX :AR؛zݓˀwsƈC `l V2'Aq.!sJ2qUD \B@8U}m6A&G{Ku |h՛zϫ!ij/Զ q2.8;hu͵&)G]ϙd]*M2EID(9a4[e2$gajHrqb|}G ι1.*y-,$EbL=z,]NaWvͣ_0v6hx]@xm* *KAVQ_UU*0GsLƪ?qpʵ#6)42!T vIĬ\x !pܨ!Z.) FC0)~6K>+.[M(oG&ln>}>uUv[̑GXJdkZ}A 6ZZ˱g@ y$2D06v6UzӀɂ\HMXr\(v]=Zݽ3kjD_ T@vl~4Xj@KiNA_N8SC `^x!%E"юQ K3 |o6I꒿%+l7f?#Kqc3n\l\ξƻVI\3;&9&ݼ0 闿/IO$&I.}L%%i;iFRow>}1u9Y>~q<3gó$6f^"9 Jx3h,Q ʓV 61wn隃ɰ* %~|Wv~JNM?i>Mwo_66;;wUFW.FD㒓A&oowaO^epvsT o&R#ݽWwaׯ{>8ė;5>ߦw9[,g*~n}[4?MW!j7[0䮼|9wW;}Nr.Dbس1 Z]غe16&m+_6uci2?uki?>l7ͳ`7~`l_L,B|>q)"ؠxYhk.a[&-F{9 *n7d>sӚ ;;s ٰۣr^^afuF4i|7, *c@f;hfNB1yΟK[ƒnjus:{ہ~ *Z4j}[~>z_vo]yl i "ӧ>lGE?mG;m2nxk {a7й+ӯͶ< Xi)]+)v{g^ݗ c+3`x"gχ} \BO#a:dkqcben6JB>k>j/dG6۹},ֱuJ>?'w9ٵJYqj3a~+yw2_!RGQ#kcN=!b.-U|'퓏G}5;gO^'FYFo-lZVkO^۽Sɻ]]]yzUGM [c9{r޳O.'듋ۢRKUΕRڦLcx*AmfDt.'p>)pab"H,qu"h?6]xN'5rN`tƍThU7h5|t$c)+. 
,C!Рׇs[!{T^@5o*s/I}YZ|kb=cmspd4I}+97#,C 4DELBSM9:IeX9uC9@2:obIҜSq~4=3t3&)yw6Vʹ8#8Sah_ 䤃Oҡ:VR0e=ᅈmWRкp*ïHJ5202!C sc s< `*JBQE-FOGvFLʳe֪(uVZt gAjV(haHW^ e,㞁\mtV5bZjQ3@XQbeuZ`s^q064S@5iUx欴3 ɇĨ'mB^DpN *sWA:/Q$v(Ak)R&$r!dqP.CGGE^1R23, r Ji@?cY1 ɪ|T:ykq 6Rf>3)Zb0M[bHScl EXCtF `PI`8B  z5jA*I 4j=P8QmP $AdS$О^1GPj4㛦u5gr3ܓOb8Rx07Hu `n#wu?jHKF7K̥n5l`%ZA35f; oK}䔒Eeɕ+|;;߻Ri *YX>SY3z+`z; c,cV>F\B|RL^۳X4]kxlO\FRsBF^K>( SPƝ+{e>fQMNɫSlU:ށ$} Yhcs&869eY^H82I h!rrH_^=bKzW w Oc* Du@6/jq FncIHTǕ|$ud7*骺I_:&Iˊt{|lo/4GVK E#u I4i'qGѾ#[2ϷKiqLsvznvO\Yϟӊ*3r]flO]n+Fӯ4Hh`pN8)${7R5pBpp~9Edp9&5"˦D F1`!$TԘ٫oiZ]U2:Sv[d.ɋ߾[r0x4r6R#Axu[{0k ?,!s2( 2,Kp֛19^&@M<""c%Ht`umi`ӱ'}yY N-ԹwCϞ%K s]rs[blchn&pܓ9\zsObLdQяZY_Dh!jtO4C.¥gi1 Hmk͞R˪==5{H]# Ze3S㠗d?V'r]yYLQS۝e=S[˷az塪Yq4ꇪZTz JS SH!O$yIbjJib ŽZ@C Ldg7I1Í o']nTO{8W Uo 'g=_8CvddIGRs:jm>mK2$8+| ocǹD' YyomhlZ4JTNzqxJ#G3١K((cRLIm)25J]:ms4nᕧ?%ﯓ#sgǁ[-gOC_a8%=\<=d\*{{_(i_V2`KzcmБ_VaKzc؋/k%=xfGsN7&E[ l~QD6^|Ic,^q1/ʰs= _ ,+nj08$y/C$#vtFM3ƳYI/^d@ntwY ^s3+Svbz ͺM0 d .fiA30~25.Xu9 J/P$i/= ɑ%ۮg 9VhU:V+sۓRyc:Ѳ{vUޛM2 fFEoq P7uucy@QnJG?GZ<Ok-A 6y}7.DzҩOmEGɫ_^ېT_/뀵He:^U2U@;~I]qagYNme.2|_ɽVcmQEO`K]W?%6rq>}a!r޽PX=ܨ~:%8ݒcET}ܟnNg !!,^cauSccm@őL뫇ǖGeۊRy6azdGj /MfB6}+x=SzMK<0v%ކ;; /ZE?< ׳Wjjl.̗ld.)Ս򝒍v d" Эp-cdޔ8-fn%v9&u҇,o6^h/w4Ƿ &ETv9X].фRsT_UN'dL#пc )Ԝ!:]t rn׊理ܑ8CtFFg!BJ1-1łZ^)"\ul7ꆱ=b d iX .r Bne ]Z`CYd.zo߭,b{T\F1l#K)*:qϚQ˘-.sٳUɕ ۻǫL&E{ Կk +EBýfroRڔ&^VM2xX[*=ܦO c6= |!3)S{YL4z^OPn_~<w.$m %;dFtySl?^?׻Q1itDkdLy"@ңu):2 {4 >">ExCfM'+R ˴Bz~Z^HzVL%ς{8ɧ'Ae.}Nт9@M!N$4䬽KhFG!hÉ\=lR)IPMhU ^amo p> Zx^E1e L(J[iB[܃ۏ"gVIe0(1zAx!M5fL1 I(R{_l /A4 n2aO{?C[hعY倄E01kȍa`gYe-*te90Ĝ,eKJ*Í#"c2Y%_Z| y+ukYb XC{,?psRTudlYɗ$c2OQ8K: ]|(\(?#(FSnX[n8GBaXp*RsĬ;kh(.~Q 3Ƭ!7XyѝRsV2u2NtG:ov6B+RϾ9uLt[5lad/Ns{J9p)e00O%į5_}~jFJ?MTՂPdsbG,e5qN@RE0{ɶJz{!UQldz;(V2+Jyc3Sg^qwmS3CbÁى=xT<Ӯ7Ix{$,{&aQYPJ+b"R5V-N$j]]\nFD$l=PLU[r4N@Q 3"o_Ȣ,QPIlHV3Ϲ7M~gtzH&AyzMsts9z=cbB^a^(.$u?Jx`ɔ=SJ$3-(= :iWn(} @J ')*ԴҖ9TN;IR<;ޯ_~}ؔV[G#)j]p4N@Qp'aoGz Ĭ$ŌLٚxSZWwOl"o 3kWYqk.[>fY 棏J%F?HC -TRV7FM>_~}Fw]>ݤUxJM3r_rG2%6^1Nȥ.jS#7I 9$qDW?>E (hc, \pDL4p^S#;ZHTfQ15CɨD*!TR^0Ec& M}ѸD## iT, a R*-P#}tK9j] l4NpΣGK@=YA ,5Wd>v,+m6i洐LdUցNĬDmvςpN8.&z7A|/y`Un!A٧Hݺ[F@Ӵosr9B|/jV 6/n" "̅ 9鹙i5ICLIڟĿt,!ٞ 9~[-|E2s|x9!mdRe:Ӷe/P{ۣP3:B: 0xYCmo0`%x@2ɜ\AP!9(4pe,3E{TEaJxm 9T)QK4w"Kn 2:Bө I2-ml4Epo&y%S:A@\X ne۰`| - fQxv6v,7Fᖨ;pKމ_,k>$~ѐf|%h1_ȁz.G.EKԒ3v6 1z^O{ 7PۀZлV(d!m"F]Ô('}tՅI!=B$d']T?)8 N'[dlQ.Tȕ<)(G5U3N 0GjkG痪YUGs>:8UN R[;:S餐StҥK5VsRpy:'Gʑ>N T8RbGܯbU׹ۧQE G3i G3-~Ryr79fQV/^73Wܞ?$׻ajJxgҬ !#D\*ڸznbMvq_RUD9CQ'@T)=DZeڎK'A,$1/FLgX(NO|ex 9(eM[5.x %= =ϸKOrVzbk rsLL@{wphkN8 ozAsFJ@P9ãc"Yzn dr:= 1GT'3GqI+F#}Biƶ5''p "K{oMr s`!e<tU0^lԂՓ{ɘeU\2=@TE~[M1…t. C.>nuvɷϘTjqn<li:-! xO1 e#b7t|;xs':R/hH ޻t Ўn!˝!ZS,Ϭ9{1gȑ S|6&?!ɜ-Vc8ꇺ =PPB ˙uxࠜ{at/n~'/y{ڲv g %fv̛bxsE/iIr[J¯_<|ƞեE;<_I|..3'~\E$ewU>y6R7fc#m:րZ|VϴEׅ_}ӏQVXѭrkc-qg2lHFusI6>:޴f`ttnX:>4o3ݙnt+߷X*8!Rqb%ZmD8֞E߿_^/*}%lD 3J-33$ʪ<˚ VqNhi\33s27]<9|vè!ijZ#=] 32zJl#_;;xz^v O\VDd<-z!UsuT8/zG|A+i #8JQ}jl J&U(غq $CYH6LhW5Qf Wx*d*/OMO%#P463]%#?D"7{^6?}a~cI3g9?fpȱ B!#lޢXͥ}Fƈ@;Fpc f9XټEqS}o$T5lR#ޢd(฻ eI*q@Dm#( "Č7xnt$DRCa2:d¼bvS]-4.;fVe"kApy!9*Qt?VCX㜮CI+7,쿑FϿFWWoC:dS!K\I"EBȔfpZζ{`sWGc7t,4ޅ{;-FKgEGъE0* TSpD?䯐 =ឌ^Eȅ5EaFܮd#8@ː|>oJCӎ~eńFkުH?9~dbhQd[`=*DNfh1޻c0x06o;Y0`=Qq[fn2]X؍RJ}yk4WV|O +#*h(aϨژ{-O-`‹$גk:6 \Gt1sP;)8 BNXQ. OfgCt~U_TBjFp\Rx6;!C1Oo֫DO?;Q]\AfRϳ=2wLkΌl. [*%3Ѱ䤯d>OU+)N$vvsNiw> ~6gyH@2FV&-@`pmb( aL\Ne*Ђh?WB -%lv曓w竸mfrv7$` ×TxӲx!5Cs[km~o&"=![ eL2`R8FL#HcPiRXj׫=ỤHA-.-Ë6ܩ: t8twd75l$ɻ@{8|\.l.K@J 3 Sw2 t'>bΘk '#rG>?-l'ӢaFXi4.r!xQ$:rH{sT[-P/b0Ul#-(ټZ 6R&XUeR0ݾ?딍Jy$5bBΊ'N32nKptO!:>3N0? 
fJtҖ# O:(9e{,LښLi Đq3s!SsTI4G֊a;!J!_~omۗ8ҔΥ@*p1\^ n:?*G+:z5V1,^AӉNV R`NW!H]8bvOAyB5].gvgW>} oNKq ]}o9>/AyQ?pJT9;XU0L6~6qgXas3ȳ̌ A8Eٻ(8<YpI6a@Z?CFVǂd (0&mdIm:9]Yͱ$e ^~,Q#B8 HN)C 4RBEBPRu1B:_ ]mN=3NUX)E VSK2oTV`OJl X ' $Ziq|$mX %g-bgp*A԰mjU@`}Xt)Jjp'KC[J]|>[ p!iP{bFѐAo)gRɠ{R"xEC);i)uA)@")U"PfNWX뱢>34,_6> "jn>vтdȦ0YŀVĢK9U/INNȧ"t K $:sw$R"r ZmkV<$aj0]y@"=_yiL& da NE,&L&&Čh$оWS˰I`UeKGDt#Q@Cvomq;q ?Ld[tVZF] ٕ8x".1y a%`fl>6$nˇbӝ0{JTXJ8(F|d [64)G?3Jal6Eؒ-2;"ދyҕQD $)ȼP1r&F|8!G>}A|:aTX#>j3pڰ+2krєp_1ӂ 'k=+7< $I4q$e-Ŭb.2#@1+,D[ /` WbGW!Z0FTv&*f|Q; y4TlT)0 hA9Fn.LMai+QMn7 2PWEՖpGR%Fji[8\:'tW^2noOVYxf>ɛݡ~Ɯ!Ol_*#Ғɕ*<|~胖 ,:?͎6;]d TxG=UVOu#gy:F| _3`-/̽:K<-_{aB_{ΞK"T"S-ºK)%/ԟ7X9H4aRFɭّeл{fݞ0cke_ll QBք?z[C=b eݥAq Tמ؃H"D o*!"gM~픽RLع]%; OJ6)Ar]㐫Fhgj@RiCW)?}]f>ZЋˠ&FfT&O,X<EXNQ²FZhˋr!l^}he5;h@$i1=wdJ y\781Z2:=as4]]Ҁw1 P/%E"]D,t Si0B,N0=xW͊xAh.{Lz=(Ə?{xqoo/{K(XQÿoӏNzo}{xܩI; 8;Hk'h1n/~휁// AxPD m@/ fn}燳_]}wt~|ث0y7LnsѓA}a9Е<_˺f=:tSׯ1~x T`40 `p5`L/>!+ ;]0Hʰ{wIA]o΂ӓǽ s FYd:8> f*O:d,'iՊO~{{XsXo.}u~q<_ /_U9ny<\XW߿:~|s׌gA#Cm|[g૛?ovC ǬD B}* ~s)NP?cK%9X[IeM}زeEZqcPRC!iKLe&V=: P^?+4`]FU^ɚJZe#?} tA/rLWAK=ea2iwaq#?H\ϗwZ`}hۡ__9 r0/Wqx 0ӛůּ~{cyf\|?p%`xprKW|yF3͉'71@,#^~`چί@fB0ӗſ"ә8Ll.j0 M\\y,g/n8%kvo cx&~3^gӹ;ZzF[|e-zyDs|>(9|L-D_6>7='wx9# Q:$^atGH?B3哏ݧofpx:#*y=Y]"/y 314Zts~'Uo^9> $yh9'4_|t}z=fX-ǿzJәtͭ)l[hg0'#fXɞ9`+!K M&Gӫ O 7~?/>8OưY9&{0nnH <>]E{`0ʎ@b[Ԃ! 4dP9B*$OEj^9g]8n䛪cVT `w槝5X}6o62)UzBz {I6 ~{[mݘTQsDdc~mH6 pb&$ t0y }VP )0m\YB)pȆ$3iR<2=bB]_=U\IȘYÄ X8"O{aGJ"l~eoơfF#M+qҕ"172+ci- 6g1WjjϚrryZX[dMi=OOO8iXNaJ3̴4g_|ȫQ2Yh_Kw g/!<'({(}K+؅elwQ¥=6G SC( - ;EmՊq8po@.l`X0 >H]VƹU? ` xZbtʼn@!ypZk(S[&|2t QIP.fm^$k11@6olSw%jKtم۸m Xq_ LŢb|ͪQj<ܨliںfm+RBf -3BY-%x/\D1uLpP@Ɵ!+$apJ$ 0Bg#f@Y -$q^lh,e'D11`fRpRcj e b$D06H-,Ֆ> X#m%XΗus!֬]s+S}2_/8Bk^DRloɧd1 /Gy~W+k^tś .2\([lf]yb yuA=#(/}DG#:bDGa%QyuA.pk F /Ť7eM|)"=R1.juTy:_5D̫;MӼK \/(%x3$kޚϿ-xTdt!sR$ )D$rL=T4cgJQ )ռ&ȋ)v )gKS!f$]q7X _$L%Rh>]k)vX ۝Y YvQTuO*Ei\$].wnY;$MKr8qp9Mp:bo53!Dk8Z)0o ׸{|fsIpF)}c 袚QL*Y BJt6m2fa4ݴ17m`rvb*[Lo+R;[E2n=JLo+j5pBbFݶhQC[kRpG[2XM+3 yHʲCBY{s5n]FO#;T`an,TYS7P?1c(AD{^)}=I/}QEK6ѳ{[j9p $.e[!T F>=(Jh{[%OcZpzhi[맱j ܥ%WY؞/(D7%6q?y3zybzxL4.iARSd+$l{S*_ [oݷǼ kJ:v= 7nm]nqL*dl51o r*%)cM[`਷~Y?-kqlMb_bd"-asn>VMi&l%Q sxF>(dx&oGoZwQߩޤv95fAx"u옍.ҖUFηD=pZͥ(pvn#3[0ѻ4M[ˡ(LL' uRҍHXY+خܕ͈N 6% VmUr^=K'PEўJ 4%AҞU%K 01_`>0M3S:1 ptz؛N0%!בgR ^fY52x, #:# 3L)g O`G#Τv(fX$Ju6 QbDXi;L1΁G*A~<~ɇz{yv~FT.z A POTHQbTvWm 1ˡ?̳^\Oa ,"M7װwjf?y\1#bYdFqRHg B㰷[\Tc)Tδuu@_Z4-5{ScQ,7w<=9O}rR޲@tÛޫ~9',h*r{/ ŌF&D8ڱ$,seǘ%L( G,cR vRT+((YDQ +@uRk)GЯIi엥21JQiZﬢF*-j Z+].WɷM(dz [4HmArJTZ-I9iv]"(%r\!KD 7\%"Mh4<Fbm")/)P bkӼKNRgQB T ꋹFZmjNbj[a@#%!+m`_2YZ4݁ͬA+ݚ `4H/]\jC; MT:NQ8’P80Ncl8rֲ e/I^g9D@wB u<F12Az}*W1=W, F^Ӻ a]`?m@.a{A8PF~O|1}>ԝ'l};>?VV{΁4`в5䗖 i>"UǕŚ"Qd}X u 9P Ê2/a Rb^]|o+ڝ@eY =خY ]ڮa])ɼ講gZ\]7T#Tde#i"vcERS* +yXm08%ڐtDqׅE±;= sW~"Zp>X%Y#-@| `1cí sxfb'*3u$1RDS&yw W /gߖaiRP޾29f2:_ǰa ?}V yÊfIgZ?W$|3Y-}2i)iD^p+(ؓrGLۗOw7k,zXSqwX~g<;"*έi3>UFw`k3$!ۦ[OZk:-I Kw?| MbW䤩vcBF򻷄 |-p8Q ⅶK`:ft9)Aƙ|6T֣zw{w3Y̞.P+;{qhZ բYFDyPZ(f J CHΖE8%cgbH n:P-e-j]\Gw\]W0Y=3֧ ؠBkO&{z©%D"$A5; ^9J+㩬#RAf;s+V*ABSbNX3Ǡjz _g)MyNCі㡞ZluE8-OqhAxd} { {g#!iVLsl1@ qU(‰R:#y0+MV;,F 4bߨ~BJz<-j/+!9wyߑ07kJg/hc%zE{*tޤtn&ێ .b|~( YvVJ`/uNcr4 oFw P)'.SF#ׯʀ*XLjgeT̪v.0P9ٮz>ͭX KcUaT![{^e~n=:MoUEm(:eS k*,Zbq*-C4Qp"ջb; )ޅjyiq:ȩleJ{?Zs'@)cp5:<pPs LB)b+;L Upw6|TUcU̞#*B YPZwX^RLmP*D-gHA_HL%"׸s^!!1t+ |ឨWLx׀ ew/S+[o$N_"r1TsJuܞjJ1ռVoO \/^|^L|i2Orfc{3 ɥRK+n_愥pwױSKI;cMDȏSXM_t%mbtJ3;daU_f 9׌ܟۄppV/ֹ%_A E+B~KӚj$݈yhs0-)>PJU]0a \0bFڳoDxt8؏ÅFsiTkBh%*;`FH퀙8RAJ%6(,_mZP^.C<B>G}Ͱ,ό1fȅŕZ=gR{4Y!Çt Am-{DA j/8֭>H ^j]11 {q=]98oEz/Jdx#r˔ AԌYDqӢuZ8eiw Xo>s`E6Jd5}'V~1_,z\>@Z8K3m6,<7~ loR[,۴٥g?>h>-[eHPIh2-^ 
,l]`eXd^ףXWhfxHN%,6dLNҰF7F o o4^ct7.2y9PF15t=3ƨ!m8K8d̊|PJhF wAYa^* %h4t5&.X+*>j8J>TF,L{2#_t\R9\_ceVFc`TT_qRKbܳ;xpȳ+^w^}{1#)B_cO}eDTGE:Ͼm5ȯ1> q=2{ikنj dzW0(DL9~zn=8Oq%E"va;|>9qܫ~]XqjV0fw{=+n,BJ/ r 1k[nRdH#"U4HzLirtU.w9<.7!5}`]WӴioJ2mw~ڹ*b#G1VIC0WkoGV.4coV1\HᎨԝ݅jUn/ڷw֛;Pmµ"zEw0=`,WP(QႂDRi,]| # @(mӷ'1GsaX2&?64ТŁਜNHD8-KIML/թª:YcTa h{INǔ$ /ЈbLb,`-1U u]A޿ J%g|8T $g3mJu[jk9iA/߸EoN󑂗H}B򈍱-i4ZQdIɀYꞚsvJ^)|PlO0C\@/;)_Nߡ7/m:g;LH!fLryVQZOeA `,&5ǨI鸡,p  b:f9Prpxoqb!;/V5WO~sZUm. )rSuP.c  r9=n8B'ĄP@` GR^V-S)sz(1yM)=yn&N EepeK@n+ua=D“X# ƕ1ހ+&CK2Jj:b/C2H AEc(hf2rv4z߅jk:|9R))l3#,, “I",Nry*]X˱8JLh]ZLkT)NO*Xۡ>JLr Oc[ ڊIdܚCɧB'\J VHPB+nTvCɫZ٠2)sz(1yk6Hɧ;iJHZ"+_%S<ȥO>"X|Ni8lጮ 3Kpp3z(1y!8!8LXDvɽٖk^xw0& m$5q|l=~|L]Ọ52)s<|Ha L &݋<]L|z>~~''>; )P-YN"zaۨ"U$A1ISbv%nmϋKwϷy#,G6 g$A8v;{ G{]Ġ,NJYEBs82 +M1(l}! n:۟two0ETѳY;zc/M7M}j7D_1{k_=3ez^qǬ~'2=<7 VS͸2)+y h&޵q#2H-pA x7l>%yfn߯8ZnvgBb!_b>Hc,p'QB3W #[1y|*FmENU&2LEt82fOeq>]Xl//^]|wq6[].-_@I`=;X`qT$ş븤jTqte 72BM#kQ\IpɥoE%=X.^i?ȪQ11n~^zķmZĥ;SNa>7.FAJق1T4n5FhL19hC&v EtrhƋ<eO4U!!fxvCVNs,pV((P(E3Rlym%A3 ֝j8k;V;š"RTѕ:,"Y(s ؎ lP 4"OB`VS>H<']2 lbc1r5n1&1EhŨVΪQ7Q+F1IѩC@j:jj:1DS=uIo;Io|'} ^D W{/0=}[!CJ BoFIנPp=N .!!F-A9(.RTSPh4b CB"-|PR2PZ֟_~?Y;iMA}iy.T$<YYȏй_|̝۫ssϸұ!td:>k~<-A'Qw|8lWYnD/W2F0E8G 8CQ 79 wZkTU>ʣgύp^ eDǓcF: - g;@v%OˮBq7ީwffiOC7Y dd2Uk8l sYZJ +ecĄ{Ͻ2P6N<\x E$f5FYF9lѲHxSpOFw`2'Tl~D-Gflq锶` `_1'h%CJ=+(u-p-1G[e#EeiCmdG+lW\Yݻ$lD?(nN3<'a qME[n,zPɘTR`S$A`0:FēBj&y1MQG[Wٳ˘ olV쭶2S2T àG H3όQDT2w 0e#UE'BRM=Aԣ>܅dݭ8ԑS"߳@볈sA`S ׊" :ȓ,=bj8 %jp&h;!4I2&'BW7~H [M~dKf[,`ג`gQeG > |6| ^EVCv u^ȀU`w 8F K'B;u6^TuE]*jҨ>j.t B^>$'6c2$駍sdG!n F)o)ÚC`J҃2N Ar إwfQE#9RLO7^5?| c2`'Ar5U}oT{HFZ Oxﻜ4z Da\QSƖ>gcbң>1Ƥ!mx5` {zѲ_mq)#BVvݑVr0[&8c'J裙(y.JONqSDzt}7bnQЇ}OYp&'XI‚iEGO9/?=iMjbKF_/[p( e9Rb]qKJ6Xn6ӯ^SsHh,ġg$ms9xqn)&ɯsд><< )sltUӏ'ֻIT)'2O(Qtk#qq=@ 1H.k@jHM 6JXJŭv{W1ŴLn Qb'bE\[w{K[b+ 'WZ'WDYz+9Q@~e.Ъ;5]z2$O1io][TH Z$\H7i"4\|C*12>~@ ^'WDZ[9bݳTap9)0餕siqc mGFaC\[&j7JD'̾L0(t԰Aa9(&^9( E&GD|x㠘(<²"7( =s-DlX4x$1׀Rjvo6&ҒqYIO7GI+QB'ƶ#uʽ}LQis6#boP З׫u5IЏz2LaI ~rCK9֍"VDO,ṧ>P'))TEq7oa.YUc׌E{Z5>(XNAI(n$v Oa)p"c8[D -,1d7_C0£*p*6VPO:1CGC# 2qlZ,TD$F+$AJN(֯>̞6B.!Q`wC“^AQ=;$.RS;(=J*St1yEZabz I@, U~qz:UŹi^&MC88J˚V QŸbVW\[d`lc Ⱦ8u{ܐگng%X׷[~ Ħ=#X>C0ϵggnny v An P 5 _6(/uv{3^\n'߻ټǒh eXP՞J3+IwCop~7vaA =D"bJ9AA9)xSzcH`FHmoV+UQiVWX)=0y{q1wTRZGeA5Sza3HiZM5ܡh᷿^W7yu'g'~?Ͼ|y{ yyW^\0| $@@RAoHRc5HggA0ΤԒ9^Yhtc$8 ػ7kU8%Kh'" 0}pdmQԝjx6M)^>Ɇu]X$ҼN -MpiT6L #sVZ!lnUIl_@1M;IZUIO&\=**L}mSimjN s6nF z1BP;`!Vc& *&,\PZ.լz"K1Uw(lᐶRlV8Gk- 3&J 0{Ck(P'W?8/8ܭYUsx'xMe//^]|wq6[].-fnoqڈjuz5M)*=!Oua G h@aQ0y%P9&/_ C*L',)ȒLBX]KyCՖhx 1X[ #āeQx ǧw d"F$C73GBuoՠhڱgP4Uj/fY=N狹Z?v+qwe=nZ~ R~lOpדLUd5Jj'9,-͖J*ы!J,rg:RAWDWbjNo|9^'5HH+O7}{G:^ ܛ9}ȽA Xr/, ʋ)Yfwc{F&-Qz7"MZH!j{7DweXCgH Uɬ,~n?jռ.\#cĐsh1IᕖO˥UNjSb_J"G ;u9S 3¨ d FVWZApPkd$R` ˅ǕZKrC !Z}ѬQR<2#cMܠ~|6%՛>VRSToEw?;{aOb}7ӆQj &?";J2<& KG ϵ;qFzk<@ԯz|}g9YT],ﱏOMf)ri\uyD0o\6Ekj"W9DWZm!>-䉗 /R W>,Pټ*dNCCdymL[̈@/іLKTږLNMs.>+*e̴Ko^f]%D_\rEO+ +RRtaedh2:VgWDWys2SH\tGkI/]v՗=lg-Y]8NH^^nɟTbmD_BS,eF/sWyq8yr7oN./YAn6$UOà |\.[V_dNBhn|>_-W( np';yn51VpH | (a]*+gi2np&lZ6sO~Xn cg8ij*Ir}i=BNe!ER: "cLYA3ΥR8q ҭ֢8 *hI"RQRJ~'Oᦷ;j+(6>VQhPԐt& X^3g5XAif,:B` ݿH H5$0> 6HXs:VNk=Q>/`8ͪSqTwȟEE!CQYqd)ұb7/u1\yݯ&-f2W%LmLYƐհ&Xyα9e6l2E5+k"i.D -<  #GȬ+@}gJTHb˰CK 9SP~hUE# TZgkjdC-W3$9$zty Tg9pאWsbpq<5۾5(4=* IDФϕʍJ X'X(+ IR{=Vͧ8bdlp(YjHJhyFM1S&:ѫtPDWrZTD?@:=p#Jp;( :1qf TuŹtT]aՎs}E% !\AMCESDA=D8=EtPPt?)s0y&'0Lɤ-wyea"?C/?QcJtHCI-z^i.2RVɄUɉɄb CQǣQ1k+A !d4[|~萛|zv@b.9YL^ PtxMT睨izlJƌly,\G!1ҙ2NτW *Q aiqGͱq.ɐVl~ĶԠѠF23NYvš,XQُ"c y!0hq2A9a⑈J" A>{-cjۿ + 5D3MA_W@[׶6LH,p2EsYs 
nֹÓp`3nU28iufnd;S(hi4vB :9Wལ [}{M:t,]K.25|K5W{temF\j?M z` 3/z~qٛW?=O?=ӝQ| Mp{@u:'"+Ǥ3tLe3^(& AѢ7 }|nH}2uP5F;|j=5(+${g9Ќ10<8FO`:\s>y|4gM;)%|D֤;5y&&*<"O#RJ B|u(SiR3R(dİ (HTx"7)Z2S&eTkL6b\wJ$Y,JAv2WMJgLByq>Fgz-R)˄U֘P8W5 êݘR``_ri _A䫭 \iW !h01R}5twm}8`De:*|2ny~_ ƟliF6).Tme"«~y>$H+,Pr9{>EXRТSPSoo:^4 "p1_kvg8OUV qxq}/AN( Xlth4ysSC4I9@ISzЗ_XUO-&Z,WW5Tѝ[)\2[Р` kŰDj5eJaJ@.DD9EDRjP& D:"JסD"*QTw)qN!-(QP E+j̖!]I 2C'_\y xڪւ-,,C"iR7M%Uh1@]6c.ƘB<JrG2*,<6ʦı3(u"8f"U+PX6ՙ<8aM-)Ya$h0< s La#7i׾:QHjq_B4d(M( @1jb&JZ̚U L 0lu1ʘJ((yJͽ`['j !ZRtJ+r)F3(Rd`pa| _ڬU0pjP)WxV 8jpnrdb91aH=NgX*zU"X+ :S2'[-3ntJ)OeT9oSX\ӒEVd`X.+gǢ; &W k%/uap `,VX3P{5oAPQ6j9ē!\KYITSІ **w E$I7AxF+s,w߯5Ql a<\=')59c9[LY-a#baYJؕjPs"#[36,7EOx-6Хb@+fHgT>xƌsr4/WZ|:ܯWE=V._|/=/=/=/H^]h ƖK)25,G>##X0P?`+D%ř]@.rLd⫞ICUR^d[pg#Y :G_)p䎼9j.)FGR)qc}d~[bǗr ؜ e~Vd)(VM =:~X`nD^Y8Ӯ3dW_дa# ~86Iw;tcFt/!=ֱhM+'qbnrvzl1t.V̿ J/!P]^E= @`?v1Fλ/HOqS.rV:ŠRBwuhlbq($˙nL%ye8ŭ c'+V1N(=5gS>6eUy Wa+ga3܊ !ţw|ɃlaD9%r6= M0XD ,A:H3=+-3jqʝ޵#_ ;-^dm j;%GeٖiNv^bXźW38&mB S#b U)=Hi>KPaJӺ}/{%c+GLD(XLe4h* ٮXr1z`%'um|ѐ#4>כ|sx|[G QxcFObُ7,9H"RofL}fӗ$+|y<7 GɎv¯ݳ»o9>=) N ^ݝީ\õ/jyVb{eӯA@ $sw\5˿,б߸`Mz=<裚 `@P'-}9OkciP-Rқ@?).aykE³oG>~ŏ_Rq1r跋;GoT|zCE}`Mc=ޜ';Zw}s Q B ~S]=171ŏ\ ( ~~ZgSj?=v\_=M&\B|-J)gFk4n{7{1]cp2^~܁, 2 s lzunעl j2lW쒰lVW|m[5?gww>G{x^Ol (FӃps>o⇘Vw@q:lHy;C,ڇSJ7P[5:}qtkOǖn%wnCp_E{Twc"i0w Nw#~Av[{:J!8/΢y :6yv6{ݢ/uy3{qCQfA?_Cvڢ{K p˳:KMG.5'vƝ֭+,כX-h:Q9b1 1'I0dYa00Ek #f_zI0m'v"ڑ`~(_6͖Txi:J@E̚)RFI}{I i nk姹DސcJ*-bF!JLf(wNMT`Dj% v5# «* f/BJDތ39k-g#zaEFZ[{pސsuɰUYKA0%$\M.vǔa&''@Rݦ]-ZCڹGj%l LGYYjD*2e_Cq9AX)kF"MaC1*T)5hlpF%{3(xt 4(p%ȕ޽OsVKdAleQeь߼틌ّ~QdOh;l=5EX"rj󶬗a7?R]T-TZEc)9cq^[T!>k]zT {r%3d//ݢlHl JCMd!@C@j` 嬄'tGRRVEa,2U BT-uy;/4Xhr;NG.+-2_7u”P+ YVj( ga}r|A}K^9MjA݇yn)%XA{F Vl5v|L=!v?-ZсCgzև#%S.İ)GIZlI >b|*ZX(BGV(ޱev8i\#v-:FU-l"2w2;^|GP&k`IZ $IRL'~ơwj}ulV ӵs[vNh]d5v;8#+ZmVwdiiG g`w\Y?-\Oj$y~&.92wu=Od$d)(+#c!%xP~j^tl _Q87|ٴwu.P!+*+rBd G;^Yed]Hr}IB7 T.)aDЕ?H;H>NuoY!p4ѻ닇)EHxuv Q $8O!HF xFHByv[o=MZp;OMUaA8x\>k1%VFhR[ s2uv?/%CZHuZCm2J]NĆ!g{baّPQq93~K͵aWZ9-vlZJJuFvcF$Z'AA[i\T6N֢*dQqFq5O q]lNSF j<,Vb#b$zxUzWn)ʜHhnԼ5[aS'H9;Q [bѴ;5I`v@+;96^cQi[^= 8kdwEYnw@RbeNda "8Ϳ#[%|$(gSz>=59'[Z ;KZ;Т:-!Sd+٦l.]þrդ9f!ƍ2@EhN1StVEi-yM2+7I { ;W#$ܾ4?Ϛg7\cz{$Yͬ}OgR1` 99tf]Č5,I%U1*UV̪%BLV..O @R:)E0*ښ ]YIb:jInHe@$eZ^/ߛ-O Cp "2uH۾+Is+qOʳ|l-1*4^91~EȩZUzѶ#3e4&d_B=W.?o1霨ok%3UpOh ug{ ?|uy/3( t#UA%zD7TS//@έT8요r;OM.>KgR+qZHE8v^̡;h`7|]^,݇~^L5zj Y-= HԾCьi/ [Ի^ʬ7j+F/"&*.4|v~X׫yze\*hzbTIuɡ '鼻Fp/>T@l}_'8JUTONij=*_hI/?g*o-yɶhɪ.Ujׂ^)&;_wYe'.N??]FnR^h,E"E{esn6D:"HcEc%hw ޟ}A9-b][o[Ir+v4ٗjl&dv_A_-B"eSMY6-Kth2`KuX_uݻd25b! _wɗ5[iҾ;lv^ַޒmk}u*fGVc躴:EgYyU6vX˂8˄NzÎac+ߒ-wqs:k]Bsy}al`aW٢Q59_ㄯb-Kɥ^-9!hiiiկl)jx!r fZ̤f_=gOU%G.h7Tg~ϝcGJy=[|#fT8M9WݐGh P$&$9,o&U>l~Q ZNbÆ*$Q5tDFS,U*vcDŽVsɉ.pd#hY!gNxF&hI!b->c9͗kd+mW`"jHJk63/˱Z"VN+n'uS JnUiiA%MZi0[NklM-Ë&)j68L"v+$f ;Oncuouтsyq]gn:l&TH+o*GJGBߦy1'яZ#eWE15 +"6.lBϖv 3TgaIݒ>O?R3D!=9 b} ~7]8J1CP";&u7ۄ.ɡD!\"`!h~n{YK+isY33WZwo:G9p[36qA1*eKVxur&?>V~,jSF-[nݳ^4u.h_3:suRՋZjHRh sE_O Wo&S8e3)Q(b3sI\2Slz^xi/@i1Cn>ÿq`v~uʯ8\ٯS|ЌW|:yg8![џC/I/U?;;ȥÐ6;{4'VjwWg6xWG؃o<ȓϱO_$?~n_=KW;O-[& Z5"Iv2Dy#MOfQpZ.D,8ROtӋuP/;r4wTfKDhc9qG(Շ6}و`trQx҃A7LC>f^H*ɵdeRªExۦ)Hޘ2^~C'HJgKz0tёEceВ%l&[!|šv2dtr }̉qL׳w`&neB,DM6Tƫ?b4d+{YYZ0젊|}z:abVvP.^p|)Bih2ک᳌D5A:Y- Bƛ@$|;" bPȊ_TXV}H<-N5WQ}~]> e?)Eʽ,C-f9fr-2l2՜@\6 ) ˗Ekl-^PYD5ʑZ// ̈́:d̎ݘZVi.DR\ }@,-= \. <}ccigfVm( Z9׽8E҇iMe9"ȠX[l= _R^2*g BOeb,`eVB,$r5|6k@Bfn5VgTpz-uUJS EPHނ6nP#0g?#aψLޫȨvw/#fG3b ʙMjg2rn@6r C^PH3qY%[uVI RMV$D9` B;քfψsRKRUC~Pr$Kٮ3k0BɟXAE 7M|:)X1;pǝPn-q>P 5AC)O$Dxy}z CΟ Vy? 
Zp ۦȻ`u0]M{2Dp2;X2V5e/b*C^\ꋷ1Fm ǻR@kQn2aۺյY],tm[]Fzo[şE]oCfz<鬮:EjC :$SVFCOz.zo@m~YQ%A.zT+")mDIBHXWuO90wdR}Շ_Q6B'w:RVgQIoڟr(OЀID3^1qjK}:,lzu“nnu`CF^j[k^Y2DS_Suo h3sfIA, PzVZj }V`1^e׺}HC͖~ΩMfa]lqbyt{MI[i>8'' cҔx1.ϧ ǏQFԓMIWZ>=c$,`D [N̈R̞rO'oJ?rZe6-Y>)_,^J/_rt}h/ j-~{'l1G]nm ™Qkim [wW4@(Z)Yl,)FIey/kggyxr3O `ݱ`/Uv6ꏡKDcagɧ''#@1cʳfdP18BRw}9&SVkn}3ǣ/

var/home/core/zuul-output/logs/kubelet.log
Jan 29 14:02:44 crc systemd[1]: Starting Kubernetes Kubelet... Jan 29 14:02:44 crc restorecon[4752]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 29 14:02:44 crc restorecon[4752]:
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 
29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc 
restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 29 14:02:44 crc 
restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 29 14:02:44 crc restorecon[4752]:
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 
14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 29 14:02:44 crc 
restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:44 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc 
restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 29 14:02:45 crc restorecon[4752]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 29 14:02:45 crc kubenswrapper[4753]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 14:02:45 crc kubenswrapper[4753]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 29 14:02:45 crc kubenswrapper[4753]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 14:02:45 crc kubenswrapper[4753]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
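Before the kubelet startup messages continue, a note on the restorecon pass that just ended: it skips almost everything under /var/lib/kubelet because container_file_t is a customizable SELinux type, assigned by the container runtime rather than by the default file-context policy, so "not reset as customized by admin" is expected output rather than an error. Only the kubenswrapper binary is actually relabeled (bin_t to kubelet_exec_t). As a worked example, one context string from these entries breaks down as follows; the field roles are standard SELinux, while reading the category pair as per-pod isolation is an interpretation of this log, not something it states:

    system_u : object_r : container_file_t : s0  : c682,c947
    user       role       type               level  MCS category pair

Each pod's files carry their own category pair (c7,c13 for the registry-server pod earlier, c682,c947 for the oauth-openshift pod here), so two containers that are both labeled container_file_t still cannot touch each other's files.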
Jan 29 14:02:45 crc kubenswrapper[4753]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 29 14:02:45 crc kubenswrapper[4753]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.895196 4753 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903810 4753 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903837 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903842 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903847 4753 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903851 4753 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903855 4753 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903859 4753 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903863 4753 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903867 4753 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903872 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903876 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903881 4753 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903886 4753 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903890 4753 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903895 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903899 4753 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903904 4753 feature_gate.go:330] unrecognized feature gate: Example Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903909 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903913 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903918 4753 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903926 4753 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 29 
14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903930 4753 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903934 4753 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903937 4753 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903941 4753 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903944 4753 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903948 4753 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903952 4753 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903955 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903960 4753 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903966 4753 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903971 4753 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903975 4753 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903979 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903983 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903987 4753 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903991 4753 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.903996 4753 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904000 4753 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904003 4753 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904007 4753 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904011 4753 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904014 4753 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904019 4753 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
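The "Flag ... has been deprecated" warnings above all point at the same fix: move the setting into the file named by --config, which the FLAG dump below shows is /etc/kubernetes/kubelet.conf. A minimal sketch of that migration, using the values echoed later in this log; the field names are assumed from the upstream v1beta1 KubeletConfiguration schema, not copied from this cluster's actual file:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: /var/run/crio/crio.sock             # replaces --container-runtime-endpoint
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
    registerWithTaints:                                           # replaces --register-with-taints
    - key: node-role.kubernetes.io/master
      effect: NoSchedule
    systemReserved:                                               # replaces --system-reserved
      cpu: 200m
      ephemeral-storage: 350Mi
      memory: 350Mi
    evictionHard:                                                 # stands in for --minimum-container-ttl-duration,
      memory.available: 100Mi                                     # per its notice; this threshold is a placeholder

--pod-infra-container-image has no config-file equivalent to migrate to; its notice says the image garbage collector will take the sandbox image from CRI instead.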
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904023 4753 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904028 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904032 4753 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904035 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904042 4753 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904046 4753 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904049 4753 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904053 4753 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904057 4753 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904060 4753 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904064 4753 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904067 4753 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904071 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904076 4753 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904080 4753 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904084 4753 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904088 4753 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904092 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904095 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904099 4753 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904102 4753 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904107 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904110 4753 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904114 4753 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904118 4753 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904123 4753 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.904127 4753 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905426 4753 flags.go:64] FLAG: --address="0.0.0.0" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905442 4753 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905452 4753 flags.go:64] FLAG: --anonymous-auth="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905459 4753 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905469 4753 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905474 4753 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905480 4753 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905485 4753 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905490 4753 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905496 4753 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905500 4753 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905505 4753 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905509 4753 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905514 4753 flags.go:64] FLAG: --cgroup-root="" Jan 29 14:02:45 crc 
kubenswrapper[4753]: I0129 14:02:45.905518 4753 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905521 4753 flags.go:64] FLAG: --client-ca-file="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905526 4753 flags.go:64] FLAG: --cloud-config="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905531 4753 flags.go:64] FLAG: --cloud-provider="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905536 4753 flags.go:64] FLAG: --cluster-dns="[]" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905543 4753 flags.go:64] FLAG: --cluster-domain="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905547 4753 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905554 4753 flags.go:64] FLAG: --config-dir="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905559 4753 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905564 4753 flags.go:64] FLAG: --container-log-max-files="5" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905571 4753 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905576 4753 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905582 4753 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905587 4753 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905592 4753 flags.go:64] FLAG: --contention-profiling="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905597 4753 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905602 4753 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905607 4753 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905612 4753 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905619 4753 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905624 4753 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905629 4753 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905634 4753 flags.go:64] FLAG: --enable-load-reader="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905639 4753 flags.go:64] FLAG: --enable-server="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905644 4753 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905653 4753 flags.go:64] FLAG: --event-burst="100" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905659 4753 flags.go:64] FLAG: --event-qps="50" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905663 4753 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905667 4753 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905672 4753 flags.go:64] FLAG: --eviction-hard="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 
14:02:45.905677 4753 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905682 4753 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905686 4753 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905691 4753 flags.go:64] FLAG: --eviction-soft="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905695 4753 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905699 4753 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905703 4753 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905707 4753 flags.go:64] FLAG: --experimental-mounter-path="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905711 4753 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905716 4753 flags.go:64] FLAG: --fail-swap-on="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905721 4753 flags.go:64] FLAG: --feature-gates="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905727 4753 flags.go:64] FLAG: --file-check-frequency="20s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905732 4753 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905739 4753 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905744 4753 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905750 4753 flags.go:64] FLAG: --healthz-port="10248" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905755 4753 flags.go:64] FLAG: --help="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905761 4753 flags.go:64] FLAG: --hostname-override="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905767 4753 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905773 4753 flags.go:64] FLAG: --http-check-frequency="20s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905777 4753 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905782 4753 flags.go:64] FLAG: --image-credential-provider-config="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905786 4753 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905790 4753 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905794 4753 flags.go:64] FLAG: --image-service-endpoint="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905798 4753 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905802 4753 flags.go:64] FLAG: --kube-api-burst="100" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905807 4753 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905812 4753 flags.go:64] FLAG: --kube-api-qps="50" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905817 4753 flags.go:64] FLAG: --kube-reserved="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905822 
4753 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905827 4753 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905835 4753 flags.go:64] FLAG: --kubelet-cgroups="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905840 4753 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905845 4753 flags.go:64] FLAG: --lock-file="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905850 4753 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905856 4753 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905861 4753 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905869 4753 flags.go:64] FLAG: --log-json-split-stream="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905875 4753 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905880 4753 flags.go:64] FLAG: --log-text-split-stream="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905885 4753 flags.go:64] FLAG: --logging-format="text" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905890 4753 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905896 4753 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905902 4753 flags.go:64] FLAG: --manifest-url="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905907 4753 flags.go:64] FLAG: --manifest-url-header="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905914 4753 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905920 4753 flags.go:64] FLAG: --max-open-files="1000000" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905927 4753 flags.go:64] FLAG: --max-pods="110" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905933 4753 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905938 4753 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905943 4753 flags.go:64] FLAG: --memory-manager-policy="None" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905948 4753 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905954 4753 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905959 4753 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905965 4753 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905976 4753 flags.go:64] FLAG: --node-status-max-images="50" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905981 4753 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905986 4753 flags.go:64] FLAG: --oom-score-adj="-999" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905990 4753 flags.go:64] FLAG: --pod-cidr="" Jan 
29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.905995 4753 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906006 4753 flags.go:64] FLAG: --pod-manifest-path="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906012 4753 flags.go:64] FLAG: --pod-max-pids="-1" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906018 4753 flags.go:64] FLAG: --pods-per-core="0" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906024 4753 flags.go:64] FLAG: --port="10250" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906030 4753 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906036 4753 flags.go:64] FLAG: --provider-id="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906042 4753 flags.go:64] FLAG: --qos-reserved="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906049 4753 flags.go:64] FLAG: --read-only-port="10255" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906054 4753 flags.go:64] FLAG: --register-node="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906059 4753 flags.go:64] FLAG: --register-schedulable="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906064 4753 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906073 4753 flags.go:64] FLAG: --registry-burst="10" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906079 4753 flags.go:64] FLAG: --registry-qps="5" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906084 4753 flags.go:64] FLAG: --reserved-cpus="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906090 4753 flags.go:64] FLAG: --reserved-memory="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906097 4753 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906102 4753 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906108 4753 flags.go:64] FLAG: --rotate-certificates="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906113 4753 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906118 4753 flags.go:64] FLAG: --runonce="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906123 4753 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906129 4753 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906134 4753 flags.go:64] FLAG: --seccomp-default="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906139 4753 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906145 4753 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906173 4753 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906180 4753 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906185 4753 flags.go:64] FLAG: --storage-driver-password="root" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906191 4753 flags.go:64] 
FLAG: --storage-driver-secure="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906196 4753 flags.go:64] FLAG: --storage-driver-table="stats" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906201 4753 flags.go:64] FLAG: --storage-driver-user="root" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906208 4753 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906219 4753 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906225 4753 flags.go:64] FLAG: --system-cgroups="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906230 4753 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906239 4753 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906244 4753 flags.go:64] FLAG: --tls-cert-file="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906249 4753 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906256 4753 flags.go:64] FLAG: --tls-min-version="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906261 4753 flags.go:64] FLAG: --tls-private-key-file="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906266 4753 flags.go:64] FLAG: --topology-manager-policy="none" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906271 4753 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906276 4753 flags.go:64] FLAG: --topology-manager-scope="container" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906282 4753 flags.go:64] FLAG: --v="2" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906290 4753 flags.go:64] FLAG: --version="false" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906296 4753 flags.go:64] FLAG: --vmodule="" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906302 4753 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906308 4753 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906455 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906463 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906468 4753 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906475 4753 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
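The flags.go:64 dump that just ended is the kubelet echoing each flag's value as parsed from the command line, before the config file is overlaid on top, which is why flag-level defaults such as --read-only-port="10255" and --rotate-certificates="false" still appear here. The merged configuration the kubelet actually runs with can be read back from the node's configz endpoint; a sketch, with the node name crc taken from this log's hostname field:

    kubectl get --raw /api/v1/nodes/crc/proxy/configz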
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906480 4753 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906485 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906490 4753 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906494 4753 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906499 4753 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906503 4753 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906507 4753 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906512 4753 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906518 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906522 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906527 4753 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906534 4753 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906542 4753 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906548 4753 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906553 4753 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906558 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906562 4753 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906567 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906571 4753 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906575 4753 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906580 4753 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906584 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906589 4753 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906593 4753 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906598 4753 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 
14:02:45.906603 4753 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906607 4753 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906611 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906616 4753 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906620 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906624 4753 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906629 4753 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906633 4753 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906637 4753 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906642 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906646 4753 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906650 4753 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906654 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906659 4753 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906663 4753 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906667 4753 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906673 4753 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906679 4753 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906687 4753 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906695 4753 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906700 4753 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906705 4753 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906709 4753 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906714 4753 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906718 4753 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906723 4753 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906727 4753 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906732 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906736 4753 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906741 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906745 4753 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906749 4753 feature_gate.go:330] unrecognized feature gate: Example
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906753 4753 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906759 4753 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
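Two different situations are mixed in the gate warnings above. The feature_gate.go:330 lines appear to be OpenShift-level feature gates (GatewayAPI, NewOLM, PinnedImages, and so on) being offered to the embedded upstream Kubernetes gate registry, which does not know them and ignores them; the feature_gate.go:351 and :353 lines are upstream gates that are recognized but deprecated or already GA. What survives is printed in the feature_gate.go:386 map just below. If an upstream gate had to be changed, the config file would again be the supported place rather than a --feature-gates flag; a hypothetical sketch using gate names taken from that map:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      CloudDualStackNodeIPs: true      # GA gate, pinned explicitly in this log
      KMSv1: true                      # deprecated gate, still enabled here
      ValidatingAdmissionPolicy: true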
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906764 4753 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906768 4753 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906773 4753 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906777 4753 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906782 4753 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906787 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906791 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.906796 4753 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.906811 4753 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.917217 4753 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.917292 4753 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.917557 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.917571 4753 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.917576 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.917583 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.917632 4753 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.917667 4753 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.917675 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918070 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918089 4753 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918095 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918101 4753 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918106 4753 feature_gate.go:330] unrecognized 
feature gate: RouteAdvertisements Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918112 4753 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918117 4753 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918122 4753 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918128 4753 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918135 4753 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918143 4753 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918177 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918184 4753 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918190 4753 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918195 4753 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918201 4753 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918206 4753 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918211 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918216 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918221 4753 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918228 4753 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918239 4753 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918244 4753 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918250 4753 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918255 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918264 4753 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918270 4753 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918276 4753 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918281 4753 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918287 4753 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918292 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918297 4753 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918302 4753 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918307 4753 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918312 4753 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918317 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918322 4753 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918329 4753 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918335 4753 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918340 4753 feature_gate.go:330] unrecognized feature gate: Example Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918345 4753 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918350 4753 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918356 4753 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918361 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918366 4753 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918372 4753 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918378 4753 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918384 4753 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918389 4753 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918394 4753 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918399 4753 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918404 4753 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918410 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918415 4753 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918420 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918426 4753 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918431 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918440 4753 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918448 4753 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918455 4753 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918461 4753 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918467 4753 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918473 4753 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918479 4753 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.918489 4753 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918662 4753 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918673 4753 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918680 4753 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918685 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918691 4753 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918698 4753 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918703 4753 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918709 4753 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918714 4753 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918720 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918725 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918731 4753 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918737 4753 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918742 4753 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918748 4753 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918754 4753 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 29 14:02:45 crc 
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918759 4753 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918765 4753 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918771 4753 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918776 4753 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918781 4753 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918787 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918793 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918799 4753 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918805 4753 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918811 4753 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918816 4753 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918821 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918826 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918832 4753 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918837 4753 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918842 4753 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918849 4753 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918857 4753 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918863 4753 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918869 4753 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918876 4753 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918881 4753 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918887 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918892 4753 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918896 4753 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918902 4753 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918907 4753 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918913 4753 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918918 4753 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918923 4753 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918929 4753 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918936 4753 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918942 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918947 4753 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918953 4753 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918959 4753 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918964 4753 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918970 4753 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918975 4753 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918980 4753 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918986 4753 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918991 4753 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.918996 4753 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919001 4753 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919007 4753 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919011 4753 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919016 4753 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919020 4753 feature_gate.go:330] unrecognized feature gate: Example
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919026 4753 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919030 4753 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919037 4753 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919043 4753 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919048 4753 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919053 4753 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 29 14:02:45 crc kubenswrapper[4753]: W0129 14:02:45.919058 4753 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.919066 4753 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.920218 4753 server.go:940] "Client rotation is on, will bootstrap in background"
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.924683 4753 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.924770 4753 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
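The two floods of feature_gate.go:330 warnings above come from OpenShift handing its full cluster-level gate list to a kubelet that only knows the upstream Kubernetes gates: unknown names are warned about and skipped, and the surviving map is what feature_gate.go:386 prints. A minimal Go sketch of that merge, with illustrative gate names and defaults rather than the real k8s.io/component-base/featuregate implementation:

package main

import "fmt"

// known stands in for the kubelet's built-in gate table; any requested
// gate missing from it draws the "unrecognized feature gate" warning.
// Names and defaults here are illustrative only.
var known = map[string]bool{
        "NodeSwap":    false,
        "EventedPLEG": false,
        "KMSv1":       false,
}

// apply merges requested overrides into the defaults, skipping (but not
// failing on) gates this binary does not recognize.
func apply(requested map[string]bool) map[string]bool {
        effective := map[string]bool{}
        for name, def := range known {
                effective[name] = def
        }
        for name, val := range requested {
                if _, ok := known[name]; !ok {
                        fmt.Printf("W: unrecognized feature gate: %s\n", name)
                        continue
                }
                effective[name] = val
        }
        return effective
}

func main() {
        fmt.Println(apply(map[string]bool{"KMSv1": true, "GatewayAPI": true}))
}

Skipping rather than rejecting unknown gates is what lets one gate list be shared across components built from different trees, at the cost of the noisy startup seen here.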
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.927712 4753 server.go:997] "Starting client certificate rotation"
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.927735 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.927960 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-05 19:26:32.714909307 +0000 UTC
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.928080 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.966305 4753 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.969486 4753 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 29 14:02:45 crc kubenswrapper[4753]: E0129 14:02:45.969831 4753 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Jan 29 14:02:45 crc kubenswrapper[4753]: I0129 14:02:45.988818 4753 log.go:25] "Validated CRI v1 runtime API"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.030263 4753 log.go:25] "Validated CRI v1 image API"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.032068 4753 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.037215 4753 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-29-13-57-56-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.037260 4753 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
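The "rotation deadline" printed above is not the expiry: client-go's certificate manager schedules rotation at a jittered point around 70-90% of the certificate's validity window, so a fleet of kubelets does not stampede the CSR API at once. A sketch of that computation, assuming a notBefore value (the log only shows the expiration):

package main

import (
        "fmt"
        "math/rand"
        "time"
)

// nextRotationDeadline mimics client-go's certificate manager, which
// picks a jittered point at roughly 70-90% of the validity window.
// The exact jitter bounds are an assumption based on upstream behavior.
func nextRotationDeadline(notBefore, notAfter time.Time) time.Time {
        total := notAfter.Sub(notBefore)
        jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
        return notBefore.Add(jittered)
}

func main() {
        // notBefore is hypothetical; notAfter matches the expiration above.
        notBefore := time.Date(2025, 2, 24, 5, 52, 8, 0, time.UTC)
        notAfter := time.Date(2026, 2, 24, 5, 52, 8, 0, time.UTC)
        fmt.Println(nextRotationDeadline(notBefore, notAfter))
}

Because the computed deadline (2025-12-05) is already in the past at boot, the manager immediately logs "Rotating certificates"; the CSR POST then fails with connection refused because the apiserver is not up yet, and is retried later.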
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.055544 4753 manager.go:217] Machine: {Timestamp:2026-01-29 14:02:46.052834086 +0000 UTC m=+0.747568488 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:aa3924a6-9f3e-446b-bf11-65e8bcfab058 BootID:dd5460f3-0655-48f4-971d-c3e6b7a9c2ef Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:26:ff:d0 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:26:ff:d0 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:9d:e7:34 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d0:33:5f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:cf:db:f4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:52:35:5a Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:d8:e5:ed Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a2:fc:01:19:a7:2d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:11:25:a0:dd:cb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.055860 4753 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.056073 4753 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.057869 4753 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.058091 4753 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.058133 4753 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.058409 4753 topology_manager.go:138] "Creating topology manager with none policy"
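Each HardEvictionThreshold in the nodeConfig dump compares a signal against either an absolute quantity (memory.available < 100Mi) or a percentage of capacity (nodefs.available < 10%). A simplified sketch of that comparison, with quantities flattened to bytes rather than the real resource.Quantity type:

package main

import "fmt"

// threshold mirrors the {Signal, Operator, Value{Quantity|Percentage}}
// shape in the nodeConfig dump above, simplified to byte counts.
type threshold struct {
        signal     string
        quantity   int64   // absolute bound in bytes; 0 if unused
        percentage float64 // fraction of capacity; 0 if unused
}

// crossed reports whether the observed available bytes fall under the
// bound, resolving percentage thresholds against capacity first.
func crossed(t threshold, available, capacity int64) bool {
        bound := t.quantity
        if t.percentage > 0 {
                bound = int64(t.percentage * float64(capacity))
        }
        return available < bound
}

func main() {
        memory := threshold{signal: "memory.available", quantity: 100 << 20}
        nodefs := threshold{signal: "nodefs.available", percentage: 0.1}
        fmt.Println(crossed(memory, 50<<20, 32<<30)) // true: 50Mi < 100Mi
        fmt.Println(crossed(nodefs, 9<<30, 80<<30))  // false: 9Gi > 8Gi bound
}

GracePeriod is 0 on all of these because they are hard thresholds: crossing one triggers eviction immediately, unlike soft thresholds which must stay crossed for the grace period.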
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.058422 4753 container_manager_linux.go:303] "Creating device plugin manager"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.059062 4753 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.059097 4753 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.059849 4753 state_mem.go:36] "Initialized new in-memory state store"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.060439 4753 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.064912 4753 kubelet.go:418] "Attempting to sync node with API server"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.064941 4753 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.064977 4753 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.064996 4753 kubelet.go:324] "Adding apiserver pod source"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.065013 4753 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.070043 4753 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.071024 4753 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
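The "Adding static pod path" / "Watching path" pair is the file pod source: on a control-plane host like this one it is how etcd, the apiserver, and friends get started before any apiserver exists to schedule them. Conceptually the source just keeps re-listing the directory and emitting new manifests; a stdlib-only polling sketch (the real kubelet also decodes each file into a v1.Pod, omitted here):

package main

import (
        "fmt"
        "os"
        "path/filepath"
        "time"
)

// pollManifests re-lists a static pod directory the way the kubelet's
// file source conceptually does, reporting manifests it has not seen.
func pollManifests(dir string, every time.Duration) {
        seen := map[string]bool{}
        for {
                entries, err := os.ReadDir(dir)
                if err == nil {
                        for _, e := range entries {
                                path := filepath.Join(dir, e.Name())
                                if !seen[path] {
                                        seen[path] = true
                                        fmt.Println("new static pod manifest:", path)
                                }
                        }
                }
                time.Sleep(every)
        }
}

func main() { pollManifests("/etc/kubernetes/manifests", 20*time.Second) }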
Jan 29 14:02:46 crc kubenswrapper[4753]: W0129 14:02:46.072970 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Jan 29 14:02:46 crc kubenswrapper[4753]: W0129 14:02:46.073044 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.073138 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.073229 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.075047 4753 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076645 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076673 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076681 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076689 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076701 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076711 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076722 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076737 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076748 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076759 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076771 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.076779 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.077463 4753 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
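The failed LIST URLs above show the kubelet's informers are deliberately narrow: the Node informer is scoped to its own object (fieldSelector=metadata.name=crc) and the Service informer excludes headless services (spec.clusterIP!=None). A client-go sketch of the same scoping, with the kubeconfig path assumed for illustration:

package main

import (
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
)

func main() {
        // Kubeconfig path is illustrative; the kubelet wires this internally.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
        if err != nil {
                panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)

        // Scope the Node informer to this node, matching the
        // fieldSelector=metadata.name%3Dcrc visible in the failed LIST.
        factory := informers.NewSharedInformerFactoryWithOptions(client, 10*time.Minute,
                informers.WithTweakListOptions(func(o *metav1.ListOptions) {
                        o.FieldSelector = "metadata.name=crc"
                }))
        nodeInformer := factory.Core().V1().Nodes().Informer()
        _ = nodeInformer

        stop := make(chan struct{})
        factory.Start(stop)
        // The reflector behind the informer retries with backoff, which is
        // why the connection-refused lines repeat until the apiserver is up.
        select {}
}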
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.078045 4753 server.go:1280] "Started kubelet"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.078486 4753 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.079434 4753 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.080411 4753 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.080525 4753 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jan 29 14:02:46 crc systemd[1]: Started Kubernetes Kubelet.
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.081884 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.081932 4753 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.082514 4753 volume_manager.go:287] "The desired_state_of_world populator starts"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.082546 4753 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.082706 4753 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.082853 4753 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.083019 4753 server.go:460] "Adding debug handlers to kubelet server"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.082529 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 03:13:45.351853085 +0000 UTC
Jan 29 14:02:46 crc kubenswrapper[4753]: W0129 14:02:46.083743 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused
Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.084921 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.085324 4753 factory.go:55] Registering systemd factory
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.089060 4753 factory.go:221] Registration of the systemd container factory successfully
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.089907 4753 factory.go:153] Registering CRI-O factory
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.089972 4753 factory.go:221] Registration of the crio container factory successfully
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.090144 4753 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
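The containerd factory failure above is benign on this CRI-O host: cAdvisor tries each runtime's socket in turn and skips the ones it cannot dial. The probe amounts to a unix-domain dial, which a few lines of stdlib Go reproduce:

package main

import (
        "fmt"
        "net"
        "time"
)

// probe attempts the same kind of unix dial cAdvisor makes when
// registering the containerd factory; on CRI-O-only hosts the socket
// does not exist, yielding "connect: no such file or directory".
func probe(path string) error {
        conn, err := net.DialTimeout("unix", path, 2*time.Second)
        if err != nil {
                return err
        }
        return conn.Close()
}

func main() {
        if err := probe("/run/containerd/containerd.sock"); err != nil {
                fmt.Println("containerd factory would be skipped:", err)
        }
}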
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.090212 4753 factory.go:103] Registering Raw factory
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.090242 4753 manager.go:1196] Started watching for new ooms in manager
Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.091094 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="200ms"
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.092936 4753 manager.go:319] Starting recovery of all containers
Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.099693 4753 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.142:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f388f27bc2abc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-29 14:02:46.078016188 +0000 UTC m=+0.772750570,LastTimestamp:2026-01-29 14:02:46.078016188 +0000 UTC m=+0.772750570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
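The long run of reconstruct.go:130 lines that follows is the volume manager rebuilding its actual state after the restart: it walks the surviving mount points under /var/lib/kubelet/pods/<podUID>/volumes/<plugin>/<name> and records each as "uncertain" until it can be reconciled. The path layout is real; the marking logic in this stdlib sketch is a stand-in for the actual reconstruction:

package main

import (
        "fmt"
        "path/filepath"
        "strings"
)

// reconstruct scans the kubelet's pods directory the way volume
// reconstruction conceptually does: every directory matching
// pods/<podUID>/volumes/<plugin>/<volumeName> becomes an "uncertain"
// entry in the rebuilt actual state.
func reconstruct(root string) {
        pattern := filepath.Join(root, "pods", "*", "volumes", "*", "*")
        matches, err := filepath.Glob(pattern)
        if err != nil {
                return
        }
        for _, m := range matches {
                rel, err := filepath.Rel(root, m)
                if err != nil {
                        continue
                }
                seg := strings.Split(rel, string(filepath.Separator))
                if len(seg) != 5 {
                        continue
                }
                // seg = ["pods", <podUID>, "volumes", <plugin>, <volumeName>]
                fmt.Printf("uncertain volume: pod=%s plugin=%s name=%s\n",
                        seg[1], seg[3], seg[4])
        }
}

func main() { reconstruct("/var/lib/kubelet") }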
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111300 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111318 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111339 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111356 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111375 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111397 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111413 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111432 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111449 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111468 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111484 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111501 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111548 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111570 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111588 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111630 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111650 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111666 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111687 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111704 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111726 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111743 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111762 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111778 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111795 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111820 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111836 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111857 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111874 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111895 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111913 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111930 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111947 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111962 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111979 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.111998 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112016 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112033 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112050 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112066 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112083 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112104 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112128 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112145 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" 
seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112186 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112204 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112229 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112246 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112288 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112305 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112318 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112332 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112350 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112366 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112382 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 29 14:02:46 crc 
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112400 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112416 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112436 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112452 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112470 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112488 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112506 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112523 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112539 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112554 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112571 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112587 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112603 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112619 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112634 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112650 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112666 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112680 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112699 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112714 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112730 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112745 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112766 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112780 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112796 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112814 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112829 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112844 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112859 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112878 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112895 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112911 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112928 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
14:02:46.112946 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112965 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.112982 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113000 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113018 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113035 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113052 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113069 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113086 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113104 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113212 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113238 4753 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113258 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113278 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113301 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113322 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113342 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113398 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113421 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113439 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113458 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113480 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113503 4753 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113521 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113539 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113556 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113575 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113593 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113611 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113630 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113648 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113665 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113682 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113702 4753 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113721 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113739 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113757 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113775 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113794 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113812 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113831 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113848 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113865 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113883 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113900 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113954 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113973 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.113994 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.114015 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.114035 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.114052 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.114070 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.114089 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.114110 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.114129 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119333 4753 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119467 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119494 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119522 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119550 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119576 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119594 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119797 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119814 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119829 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119849 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119863 4753 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119877 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119892 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119907 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119922 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119939 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119954 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119968 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119982 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.119996 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120009 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120028 4753 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120059 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120080 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120096 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120114 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120128 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120141 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120179 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120194 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120214 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120229 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120244 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120258 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120273 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120299 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120310 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120324 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120334 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120345 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120374 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120385 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120401 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120427 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120443 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120458 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120473 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120490 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120506 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120521 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120536 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120551 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120566 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120579 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120592 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120602 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120614 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120625 4753 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120636 4753 reconstruct.go:97] "Volume reconstruction finished" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.120646 4753 reconciler.go:26] "Reconciler: start to sync state" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.123332 4753 manager.go:324] Recovery completed Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.141709 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.143885 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.143919 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.143928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.144566 4753 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.145831 4753 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.145870 4753 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.145901 4753 state_mem.go:36] "Initialized new in-memory state store" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.147690 4753 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.148033 4753 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.148083 4753 kubelet.go:2335] "Starting kubelet main sync loop" Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.148176 4753 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 29 14:02:46 crc kubenswrapper[4753]: W0129 14:02:46.148822 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.148912 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.170509 4753 policy_none.go:49] "None policy: Start" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.171683 4753 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.171771 4753 state_mem.go:35] "Initializing new in-memory state store" Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.183331 4753 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.224132 4753 manager.go:334] "Starting Device Plugin manager" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.224235 4753 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.224250 4753 server.go:79] "Starting device plugin registration server" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.224741 4753 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.224761 4753 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.224950 4753 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.225121 4753 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.225133 4753 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.234069 4753 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.249286 4753 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 29 14:02:46 crc kubenswrapper[4753]: 
I0129 14:02:46.249424 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.251089 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.251123 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.251133 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.251277 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.251664 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.251738 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.252221 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.252272 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.252285 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.252591 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.252685 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.252713 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.253754 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.253794 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.253811 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.253762 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.253944 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.253956 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.254109 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.254297 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.254306 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.254345 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.254317 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.254420 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.255416 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.255446 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.255456 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.255605 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.255700 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.255744 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.255285 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.256222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.256243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.256999 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.257028 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.257039 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.257956 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.257978 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.257988 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.258190 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.258215 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.259008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.259038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.259049 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.292722 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="400ms" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.322844 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.322914 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.322942 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.322965 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.322990 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323011 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323033 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323053 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323076 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323201 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323283 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323329 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323355 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323409 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.323435 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.326143 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: 
I0129 14:02:46.327498 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.327544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.327570 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.327603 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.328117 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424682 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424750 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424785 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424818 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424849 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424879 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424909 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424942 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424958 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424984 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425052 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.424981 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425078 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425127 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425230 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425245 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425244 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425253 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425269 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425289 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425327 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425391 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425411 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425484 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425489 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425572 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425575 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425622 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425628 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.425762 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.528923 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.531989 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.532058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.532078 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.532120 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.532856 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.596907 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.606243 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.636348 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.658958 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: W0129 14:02:46.659288 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e050c5e2410eebddab82cb0ef9e333cddde8a61df1caa3990a1862a7746cc63d WatchSource:0}: Error finding container e050c5e2410eebddab82cb0ef9e333cddde8a61df1caa3990a1862a7746cc63d: Status 404 returned error can't find the container with id e050c5e2410eebddab82cb0ef9e333cddde8a61df1caa3990a1862a7746cc63d Jan 29 14:02:46 crc kubenswrapper[4753]: W0129 14:02:46.660936 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-be05c4e1f8aba928f137d8ca9b4bb6e04a1c4000dbfacbb5a55fb8525157c01b WatchSource:0}: Error finding container be05c4e1f8aba928f137d8ca9b4bb6e04a1c4000dbfacbb5a55fb8525157c01b: Status 404 returned error can't find the container with id be05c4e1f8aba928f137d8ca9b4bb6e04a1c4000dbfacbb5a55fb8525157c01b Jan 29 14:02:46 crc kubenswrapper[4753]: W0129 14:02:46.666912 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3448e809b3edf7c90118a8bb722941fe6d4b350a9fc253bf2b15986efd01485a WatchSource:0}: Error finding container 3448e809b3edf7c90118a8bb722941fe6d4b350a9fc253bf2b15986efd01485a: Status 404 returned error can't find the container with id 3448e809b3edf7c90118a8bb722941fe6d4b350a9fc253bf2b15986efd01485a Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.671095 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:02:46 crc kubenswrapper[4753]: W0129 14:02:46.676178 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-aa9ffd16f525ef156c430cd8fa81ebb980563cd73ef1e99b6b4043b9b8a1f529 WatchSource:0}: Error finding container aa9ffd16f525ef156c430cd8fa81ebb980563cd73ef1e99b6b4043b9b8a1f529: Status 404 returned error can't find the container with id aa9ffd16f525ef156c430cd8fa81ebb980563cd73ef1e99b6b4043b9b8a1f529 Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.693964 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="800ms" Jan 29 14:02:46 crc kubenswrapper[4753]: W0129 14:02:46.702096 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a10555b58b43fd7596607456bead7b0ac303364b0a64eee469395cc7bbafe3c1 WatchSource:0}: Error finding container a10555b58b43fd7596607456bead7b0ac303364b0a64eee469395cc7bbafe3c1: Status 404 returned error can't find the container with id a10555b58b43fd7596607456bead7b0ac303364b0a64eee469395cc7bbafe3c1 Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.933006 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.934556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.934600 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.934612 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:46 crc kubenswrapper[4753]: I0129 14:02:46.934655 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 29 14:02:46 crc kubenswrapper[4753]: E0129 14:02:46.935024 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc" Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.080209 4753 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.083675 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 23:06:34.364326149 +0000 UTC Jan 29 14:02:47 crc kubenswrapper[4753]: W0129 14:02:47.094452 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:47 crc kubenswrapper[4753]: E0129 14:02:47.094568 4753 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.154062 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a10555b58b43fd7596607456bead7b0ac303364b0a64eee469395cc7bbafe3c1"} Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.155672 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aa9ffd16f525ef156c430cd8fa81ebb980563cd73ef1e99b6b4043b9b8a1f529"} Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.158933 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3448e809b3edf7c90118a8bb722941fe6d4b350a9fc253bf2b15986efd01485a"} Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.161032 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"be05c4e1f8aba928f137d8ca9b4bb6e04a1c4000dbfacbb5a55fb8525157c01b"} Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.162358 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e050c5e2410eebddab82cb0ef9e333cddde8a61df1caa3990a1862a7746cc63d"} Jan 29 14:02:47 crc kubenswrapper[4753]: W0129 14:02:47.303457 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:47 crc kubenswrapper[4753]: E0129 14:02:47.303548 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Jan 29 14:02:47 crc kubenswrapper[4753]: E0129 14:02:47.496002 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="1.6s" Jan 29 14:02:47 crc kubenswrapper[4753]: W0129 14:02:47.603920 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:47 crc kubenswrapper[4753]: E0129 14:02:47.604063 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Jan 29 14:02:47 crc kubenswrapper[4753]: W0129 14:02:47.645126 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:47 crc kubenswrapper[4753]: E0129 14:02:47.645221 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.735202 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.737442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.737477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.737489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:47 crc kubenswrapper[4753]: I0129 14:02:47.737513 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 29 14:02:47 crc kubenswrapper[4753]: E0129 14:02:47.737954 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.081651 4753 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.084058 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 03:32:13.133758935 +0000 UTC Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.167986 4753 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0" exitCode=0 Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.168077 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0"} Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.168196 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.169543 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.169633 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.169683 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.169702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:48 crc kubenswrapper[4753]: E0129 14:02:48.170656 4753 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.171335 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1"} Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.171395 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd"} Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.171421 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de"} Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.171440 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438"} Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.171461 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.172556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.172596 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.172610 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.173879 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70" exitCode=0 Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.173949 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70"} Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.173972 4753 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.174739 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.174767 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.174777 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.175476 4753 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309" exitCode=0 Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.175536 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309"} Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.175588 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.176319 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.176374 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.176386 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.177040 4753 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="154b0de3118bd87d7adfbe58f93f3d11adf7eb3e7a592782ce074f8f5cfaaa62" exitCode=0 Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.177064 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"154b0de3118bd87d7adfbe58f93f3d11adf7eb3e7a592782ce074f8f5cfaaa62"} Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.177115 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.177821 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.177858 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.177868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.178896 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.179816 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.179841 4753 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.179851 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:48 crc kubenswrapper[4753]: I0129 14:02:48.238300 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:49 crc kubenswrapper[4753]: W0129 14:02:49.013412 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:49 crc kubenswrapper[4753]: E0129 14:02:49.013653 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.079362 4753 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.084940 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 17:49:30.461980508 +0000 UTC Jan 29 14:02:49 crc kubenswrapper[4753]: E0129 14:02:49.098137 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="3.2s" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.186942 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5bff2518f4e02462e15ff6ddba09a3c44fb04cb19072d20841bb3fba30106d45"} Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.187012 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a5d367f98d62f21486659db334571303be0b003c240182fe5fc70f072014f31e"} Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.187029 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"37d5fc7c12e2b1310d540f26ef183c81181b633925e1ae8eaa54cd0852a80c63"} Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.187026 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.190332 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.190375 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:49 crc 
kubenswrapper[4753]: I0129 14:02:49.190386 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.199456 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d"} Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.199494 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b"} Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.199505 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b"} Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.199516 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78"} Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.202134 4753 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b" exitCode=0 Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.202211 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b"} Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.202297 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.203343 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.203376 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.203386 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.205776 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5c34fd26eab07375195917e0641340b52b6d2dbbec4913a9b5512fec19df2fba"} Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.205872 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.205903 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.206781 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.206820 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.206998 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.207310 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.207350 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.207359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:49 crc kubenswrapper[4753]: E0129 14:02:49.282976 4753 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.142:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f388f27bc2abc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-29 14:02:46.078016188 +0000 UTC m=+0.772750570,LastTimestamp:2026-01-29 14:02:46.078016188 +0000 UTC m=+0.772750570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.338315 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.339647 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.339701 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.339714 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:49 crc kubenswrapper[4753]: I0129 14:02:49.339752 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 29 14:02:49 crc kubenswrapper[4753]: E0129 14:02:49.340436 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.142:6443: connect: connection refused" node="crc" Jan 29 14:02:49 crc kubenswrapper[4753]: W0129 14:02:49.341197 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.142:6443: connect: connection refused Jan 29 14:02:49 crc kubenswrapper[4753]: E0129 14:02:49.341291 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.142:6443: connect: connection refused" logger="UnhandledError" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.085448 4753 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 16:07:59.685559761 +0000 UTC Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.213103 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792"} Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.213220 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.214845 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.214887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.214898 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.217135 4753 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395" exitCode=0 Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.217228 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395"} Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.217308 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.217349 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.217314 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.217484 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.217571 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.219051 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.219100 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.219119 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.219183 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.219201 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.219210 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:50 crc 
kubenswrapper[4753]: I0129 14:02:50.219293 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.219343 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.219364 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.220359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.220424 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:50 crc kubenswrapper[4753]: I0129 14:02:50.220453 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.086219 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 10:13:45.462594223 +0000 UTC Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.224555 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee"} Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.224634 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536"} Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.224660 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee"} Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.224695 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.224661 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.226044 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.226095 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.226109 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.453756 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.453987 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.455517 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 
14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.455559 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:51 crc kubenswrapper[4753]: I0129 14:02:51.455576 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.086620 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 11:45:37.944285705 +0000 UTC Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.232099 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.232212 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117"} Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.232274 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127"} Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.232413 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.234913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.234994 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.235018 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.238023 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.238092 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.238118 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.489400 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.541438 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.543776 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.543833 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.543843 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.543881 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 29 14:02:52 crc 
kubenswrapper[4753]: I0129 14:02:52.863680 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.863952 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.865925 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.866010 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:52 crc kubenswrapper[4753]: I0129 14:02:52.866030 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:53 crc kubenswrapper[4753]: I0129 14:02:53.087320 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 18:38:24.527996913 +0000 UTC Jan 29 14:02:53 crc kubenswrapper[4753]: I0129 14:02:53.235352 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:53 crc kubenswrapper[4753]: I0129 14:02:53.236681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:53 crc kubenswrapper[4753]: I0129 14:02:53.236724 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:53 crc kubenswrapper[4753]: I0129 14:02:53.236739 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.088516 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 00:45:23.005035752 +0000 UTC Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.194466 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.194725 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.196489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.196545 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.196571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.374227 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.374476 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.376289 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.376372 4753 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.376402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.441745 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.441969 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.443626 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.443719 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:54 crc kubenswrapper[4753]: I0129 14:02:54.443744 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:55 crc kubenswrapper[4753]: I0129 14:02:55.089058 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 04:52:13.754352617 +0000 UTC Jan 29 14:02:55 crc kubenswrapper[4753]: I0129 14:02:55.475016 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:55 crc kubenswrapper[4753]: I0129 14:02:55.475288 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:55 crc kubenswrapper[4753]: I0129 14:02:55.476916 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:55 crc kubenswrapper[4753]: I0129 14:02:55.476971 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:55 crc kubenswrapper[4753]: I0129 14:02:55.476988 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:55 crc kubenswrapper[4753]: I0129 14:02:55.484761 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.089238 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 19:21:14.215281119 +0000 UTC Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.127809 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.128082 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.129921 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.129981 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.129999 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:02:56 crc 
Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.244178 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.245604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.245671 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:02:56 crc kubenswrapper[4753]: I0129 14:02:56.245690 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:02:57 crc kubenswrapper[4753]: I0129 14:02:57.089479 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 22:30:46.246603666 +0000 UTC
Jan 29 14:02:57 crc kubenswrapper[4753]: I0129 14:02:57.442711 4753 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 29 14:02:57 crc kubenswrapper[4753]: I0129 14:02:57.442820 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 29 14:02:57 crc kubenswrapper[4753]: I0129 14:02:57.560747 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Jan 29 14:02:57 crc kubenswrapper[4753]: I0129 14:02:57.561073 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 29 14:02:57 crc kubenswrapper[4753]: I0129 14:02:57.563058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:02:57 crc kubenswrapper[4753]: I0129 14:02:57.563205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:02:57 crc kubenswrapper[4753]: I0129 14:02:57.563238 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:02:58 crc kubenswrapper[4753]: I0129 14:02:58.090187 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 21:20:28.262051484 +0000 UTC
Jan 29 14:02:58 crc kubenswrapper[4753]: I0129 14:02:58.245201 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 29 14:02:58 crc kubenswrapper[4753]: I0129 14:02:58.245448 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 29 14:02:58 crc kubenswrapper[4753]: I0129 14:02:58.247447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:02:58 crc kubenswrapper[4753]: I0129 14:02:58.247549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:02:58 crc kubenswrapper[4753]: I0129 14:02:58.247574 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:02:59 crc kubenswrapper[4753]: I0129 14:02:59.091194 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 10:43:13.714203633 +0000 UTC
Jan 29 14:02:59 crc kubenswrapper[4753]: W0129 14:02:59.869742 4753 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 29 14:02:59 crc kubenswrapper[4753]: I0129 14:02:59.869897 4753 trace.go:236] Trace[723801495]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Jan-2026 14:02:49.868) (total time: 10001ms):
Jan 29 14:02:59 crc kubenswrapper[4753]: Trace[723801495]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:02:59.869)
Jan 29 14:02:59 crc kubenswrapper[4753]: Trace[723801495]: [10.001731561s] [10.001731561s] END
Jan 29 14:02:59 crc kubenswrapper[4753]: E0129 14:02:59.869936 4753 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 29 14:03:00 crc kubenswrapper[4753]: I0129 14:03:00.014918 4753 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 29 14:03:00 crc kubenswrapper[4753]: I0129 14:03:00.015011 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 29 14:03:00 crc kubenswrapper[4753]: I0129 14:03:00.019547 4753 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 29 14:03:00 crc kubenswrapper[4753]: I0129 14:03:00.019632 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 29 14:03:00 crc kubenswrapper[4753]: I0129 14:03:00.091452 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 17:30:25.605291333 +0000 UTC
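[editor's note] The startup-probe entries above show the two failure shapes a kubelet HTTP probe can report: a transport-level error (the "context deadline exceeded" against cluster-policy-controller's /healthz) and a non-2xx status (the 403 kube-apiserver's /livez returns to system:anonymous before RBAC is serving). A minimal Go sketch of such a check follows; it is not kubelet's actual prober, and the URL and 5-second timeout are illustrative assumptions taken from the log.

package main

import (
	"context"
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// probeOnce mirrors the shape of an HTTPS startup probe: one GET with a hard
// deadline, success only for status codes in [200, 400). Sketch only, not
// kubelet's real prober code.
func probeOnce(url string, timeout time.Duration) error {
	ctx, cancel := context.WithTimeout(context.Background(), timeout)
	defer cancel()
	req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
	if err != nil {
		return err
	}
	client := &http.Client{Transport: &http.Transport{
		// Control-plane endpoints present serving certs the probe does not verify.
		TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
	}}
	resp, err := client.Do(req)
	if err != nil {
		// Transport failure, e.g. "context deadline exceeded (Client.Timeout
		// exceeded while awaiting headers)" as logged for /healthz above.
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		// Status failure, e.g. the 403 from /livez for system:anonymous.
		return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeOnce("https://192.168.126.11:10357/healthz", 5*time.Second); err != nil {
		fmt.Println("Probe failed:", err)
	}
}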
Jan 29 14:03:01 crc kubenswrapper[4753]: I0129 14:03:01.092583 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 15:07:45.335068883 +0000 UTC
Jan 29 14:03:02 crc kubenswrapper[4753]: I0129 14:03:02.093273 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 11:33:58.328468646 +0000 UTC
Jan 29 14:03:03 crc kubenswrapper[4753]: I0129 14:03:03.094445 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 00:15:09.724159301 +0000 UTC
Jan 29 14:03:04 crc kubenswrapper[4753]: I0129 14:03:04.095954 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 17:53:29.312257874 +0000 UTC
Jan 29 14:03:04 crc kubenswrapper[4753]: I0129 14:03:04.378395 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 29 14:03:04 crc kubenswrapper[4753]: I0129 14:03:04.378991 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 29 14:03:04 crc kubenswrapper[4753]: I0129 14:03:04.384271 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:04 crc kubenswrapper[4753]: I0129 14:03:04.384539 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:04 crc kubenswrapper[4753]: I0129 14:03:04.384688 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:04 crc kubenswrapper[4753]: I0129 14:03:04.386802 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 29 14:03:04 crc kubenswrapper[4753]: I0129 14:03:04.863802 4753 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.011201 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.013823 4753 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.014196 4753 trace.go:236] Trace[1000116949]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Jan-2026 14:02:53.201) (total time: 11812ms):
Jan 29 14:03:05 crc kubenswrapper[4753]: Trace[1000116949]: ---"Objects listed" error: 11812ms (14:03:05.014)
Jan 29 14:03:05 crc kubenswrapper[4753]: Trace[1000116949]: [11.81273399s] [11.81273399s] END
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.014262 4753 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.015883 4753 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.016574 4753 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.021543 4753 trace.go:236] Trace[1657647016]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Jan-2026 14:02:50.083) (total time: 14937ms):
Jan 29 14:03:05 crc kubenswrapper[4753]: Trace[1657647016]: ---"Objects listed" error: 14937ms (14:03:05.021)
Jan 29 14:03:05 crc kubenswrapper[4753]: Trace[1657647016]: [14.937745423s] [14.937745423s] END
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.021590 4753 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.039903 4753 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.060991 4753 csr.go:261] certificate signing request csr-x7q28 is approved, waiting to be issued
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.076629 4753 apiserver.go:52] "Watching apiserver"
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.080631 4753 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.081032 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"]
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.081669 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.081806 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.081833 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.082043 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.082282 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.082430 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.082544 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.082590 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.082667 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.083703 4753 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.088009 4753 csr.go:257] certificate signing request csr-x7q28 is issued Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.089040 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.089220 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.089598 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.091106 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.092047 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.092189 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.092299 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.093046 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.093716 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.096767 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 14:13:50.159837583 +0000 UTC Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116646 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116707 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116745 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116783 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116818 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116851 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116889 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116921 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116952 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.116989 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117031 4753 
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117031 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117066 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117098 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117132 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117197 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117232 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117271 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117309 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117345 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117378 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117414 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117449 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117482 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117587 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117624 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117658 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117695 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117729 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117761 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117793 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117873 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117943 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117979 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118026 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118065 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118133 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118192 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118233 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118272 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118310 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118346 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118381 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118414 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118448 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118484 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118532 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118569 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118606 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118641 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc 
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118677 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118710 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118750 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118801 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118836 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118867 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118939 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118976 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119017 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119055 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119091 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119126 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119185 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119222 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119257 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119290 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119321 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119355 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119393 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119429 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119461 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119495 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119527 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119561 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119672 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119711 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119750 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119779 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119801 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119823 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" 
(UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119848 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119870 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119893 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119915 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119937 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119960 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119986 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120008 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120032 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120055 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" 
(UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120078 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120099 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120121 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120145 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120201 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120786 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120822 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.123139 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.123207 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.123241 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.123281 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.125825 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126025 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126079 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126116 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126176 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126245 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126275 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126310 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126344 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126371 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126405 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126439 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126471 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126497 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126527 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126560 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126586 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126616 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126659 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126692 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126731 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126771 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126812 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126841 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126879 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126917 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126950 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126988 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127027 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127065 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127097 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127137 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127195 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127227 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127261 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127310 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127345 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127378 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 29 14:03:05 crc 
kubenswrapper[4753]: I0129 14:03:05.127405 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127427 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127452 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127480 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127501 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127620 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127647 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127667 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127692 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127714 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127736 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127783 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127807 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127889 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127916 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128009 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128106 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128133 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128178 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128262 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128385 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128432 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128457 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128480 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128590 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128617 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128641 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128713 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128735 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128758 4753 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128782 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128802 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128826 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128873 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128895 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.128920 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129023 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129051 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129072 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 
14:03:05.117792 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117906 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.117979 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118422 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118493 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118498 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118598 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118706 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129141 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.132801 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.132846 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.132884 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.132922 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118740 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.132957 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118719 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.132989 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.118992 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119011 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119145 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133022 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133369 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133519 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133569 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133592 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133618 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133651 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133728 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133775 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133878 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133896 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119209 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119464 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119715 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119794 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119930 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119948 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119967 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.119982 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120670 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120696 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120752 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.120964 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.121419 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.121816 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.121824 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.122063 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.122191 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.122361 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.122447 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.122455 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.122596 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.122747 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.122736 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.122782 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.123056 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.123227 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.124875 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.125574 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.125773 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.126062 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127370 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127504 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127845 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.127914 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129237 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129442 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129583 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129788 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129813 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129833 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129922 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.129967 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.130415 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.130875 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.131249 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.135019 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.135070 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.135261 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.135969 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.136166 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.137130 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.137347 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.139080 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.141049 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.141762 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.142305 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.133909 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.142618 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.142663 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.142752 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.142747 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.142788 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.143244 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.143263 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.143394 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.143464 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.143496 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.143992 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.144041 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.144926 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.145560 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.145859 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.146718 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.146991 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.147104 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.147499 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.147798 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.148547 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.148836 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.150240 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.150388 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.151400 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.151686 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.151996 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.152448 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.153127 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.153243 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.153476 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.153786 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.153915 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.154044 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.144043 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.156476 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.156579 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.157581 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:05.657534707 +0000 UTC m=+20.352269089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.157897 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.157959 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:05.657941429 +0000 UTC m=+20.352675811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.157990 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.158668 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.158720 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.158845 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.159061 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.159225 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.159274 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.159316 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.166378 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.167604 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.167763 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.167940 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.168247 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.168347 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.168424 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.172501 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.172502 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.172792 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.173780 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.174335 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.174518 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.175441 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.175716 4753 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.178440 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.178755 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.178946 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.178974 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.179140 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.179938 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.180209 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.180364 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.157658 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.180738 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.180921 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181292 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181364 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181389 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181439 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181466 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181509 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181536 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181557 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181577 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.181935 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.183201 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.183284 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.183465 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.183691 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.183968 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.184792 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.184425 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.184969 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.184983 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.184994 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185006 4753 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185016 4753 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185025 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185035 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185047 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185057 4753 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185068 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185077 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185087 4753 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185096 4753 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185105 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185114 4753 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185122 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185131 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185139 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185171 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185183 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185195 4753 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185207 4753 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185218 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185230 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185242 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node 
\"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185254 4753 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185266 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185277 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185288 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185298 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185307 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185316 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185324 4753 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185332 4753 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185340 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185351 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185361 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185370 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185379 4753 
reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185390 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185401 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185412 4753 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185424 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185435 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185445 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185456 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185468 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185479 4753 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185491 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185502 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185513 4753 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185528 4753 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185542 4753 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185553 4753 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185565 4753 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185576 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185588 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185599 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185610 4753 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185623 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185634 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185645 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185656 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185667 4753 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 
14:03:05.185679 4753 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185694 4753 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185705 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185718 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185730 4753 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185742 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185754 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185765 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185777 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185790 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185800 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185812 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185822 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc 
kubenswrapper[4753]: I0129 14:03:05.185833 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185844 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185858 4753 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185871 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185883 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185895 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185906 4753 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185918 4753 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185929 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185941 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185956 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185967 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185979 4753 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc 
kubenswrapper[4753]: I0129 14:03:05.185990 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186003 4753 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186022 4753 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186033 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186044 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186054 4753 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186064 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186075 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186086 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186096 4753 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186107 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186118 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186129 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186140 4753 reconciler_common.go:293] "Volume detached for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186174 4753 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186188 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186200 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186212 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186224 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186237 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186249 4753 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186260 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186271 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186284 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.186297 4753 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.185966 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: 
"43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.187330 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.188457 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.188537 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.188901 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.189825 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.191529 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.191991 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.192914 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.193307 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.193489 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.193605 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.194023 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.194379 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.194616 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:03:05.694593303 +0000 UTC m=+20.389327685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.194947 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.196022 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.196107 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.196380 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.196829 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.197687 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.198257 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.200342 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.209852 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.211118 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.213793 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.213824 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.213840 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.213910 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:05.713889153 +0000 UTC m=+20.408623745 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.214078 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.216084 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.217189 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.217672 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.217794 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.217943 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.218300 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.218822 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.218850 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.218865 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.218928 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:05.718905687 +0000 UTC m=+20.413640059 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.221530 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.222062 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.222360 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.222937 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.223440 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.223628 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.224379 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.224908 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.224992 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.228616 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.228906 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.229295 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.234474 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.234912 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.235190 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.235240 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-s7czm"] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.235411 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.235524 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.235643 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s7czm" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.235890 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4trnx"] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.236326 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.237013 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.238204 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.238477 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.238853 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239125 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239215 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239271 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239427 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239443 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239528 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239669 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239700 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239710 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.239782 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.240252 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.240332 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.240926 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.247258 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.252999 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.254404 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.261269 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.267809 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.274572 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.277275 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288479 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7qcg\" (UniqueName: \"kubernetes.io/projected/e0052a69-b4c2-41a7-9acc-3c9a936c20b9-kube-api-access-b7qcg\") pod \"node-ca-4trnx\" (UID: \"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\") " pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288535 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bf54982-69e3-4afd-95d9-654f58871b60-hosts-file\") pod \"node-resolver-s7czm\" (UID: \"1bf54982-69e3-4afd-95d9-654f58871b60\") " pod="openshift-dns/node-resolver-s7czm" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288553 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288574 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e0052a69-b4c2-41a7-9acc-3c9a936c20b9-serviceca\") pod \"node-ca-4trnx\" (UID: \"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\") " pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288602 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288624 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqvhr\" (UniqueName: \"kubernetes.io/projected/1bf54982-69e3-4afd-95d9-654f58871b60-kube-api-access-zqvhr\") pod \"node-resolver-s7czm\" (UID: \"1bf54982-69e3-4afd-95d9-654f58871b60\") " pod="openshift-dns/node-resolver-s7czm" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288637 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0052a69-b4c2-41a7-9acc-3c9a936c20b9-host\") pod \"node-ca-4trnx\" (UID: \"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\") " pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288665 4753 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288675 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288683 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288691 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288700 4753 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288708 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288718 4753 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288727 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288735 4753 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288746 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288755 4753 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288762 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288771 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288779 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288789 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288796 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288804 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288813 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.288863 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289143 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289234 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289408 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289430 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289442 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289453 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289463 4753 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289476 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289485 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289499 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289510 4753 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289521 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289532 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289542 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289552 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289564 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289573 4753 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289584 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289594 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289606 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289616 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289629 4753 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289641 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289653 4753 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289663 4753 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289674 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289684 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289695 4753 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289704 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289715 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289727 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289737 4753 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289747 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289757 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289767 
4753 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289777 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289789 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289799 4753 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289811 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289823 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289837 4753 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289848 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289859 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289869 4753 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289880 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289890 4753 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289903 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289913 
4753 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289923 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289933 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289944 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289956 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289966 4753 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289978 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.289989 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.290002 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.290011 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.290021 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.290030 4753 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.290040 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.290049 4753 reconciler_common.go:293] 
"Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.290383 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.307236 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.323587 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.334558 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.345361 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.347436 4753 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38268->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.347474 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38268->192.168.126.11:17697: read: connection reset by peer" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.347790 4753 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.347817 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.348000 4753 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38282->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.348055 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38282->192.168.126.11:17697: read: connection reset by peer" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.373582 4753 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.383377 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.390456 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bf54982-69e3-4afd-95d9-654f58871b60-hosts-file\") pod \"node-resolver-s7czm\" (UID: \"1bf54982-69e3-4afd-95d9-654f58871b60\") " pod="openshift-dns/node-resolver-s7czm" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.390517 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e0052a69-b4c2-41a7-9acc-3c9a936c20b9-serviceca\") pod \"node-ca-4trnx\" (UID: \"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\") " pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.390555 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqvhr\" (UniqueName: \"kubernetes.io/projected/1bf54982-69e3-4afd-95d9-654f58871b60-kube-api-access-zqvhr\") pod \"node-resolver-s7czm\" (UID: \"1bf54982-69e3-4afd-95d9-654f58871b60\") " pod="openshift-dns/node-resolver-s7czm" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.390573 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0052a69-b4c2-41a7-9acc-3c9a936c20b9-host\") pod \"node-ca-4trnx\" (UID: \"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\") " pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.390606 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7qcg\" (UniqueName: \"kubernetes.io/projected/e0052a69-b4c2-41a7-9acc-3c9a936c20b9-kube-api-access-b7qcg\") pod \"node-ca-4trnx\" (UID: \"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\") " pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.390679 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0052a69-b4c2-41a7-9acc-3c9a936c20b9-host\") pod \"node-ca-4trnx\" (UID: \"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\") " pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc 
kubenswrapper[4753]: I0129 14:03:05.390803 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1bf54982-69e3-4afd-95d9-654f58871b60-hosts-file\") pod \"node-resolver-s7czm\" (UID: \"1bf54982-69e3-4afd-95d9-654f58871b60\") " pod="openshift-dns/node-resolver-s7czm" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.391681 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e0052a69-b4c2-41a7-9acc-3c9a936c20b9-serviceca\") pod \"node-ca-4trnx\" (UID: \"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\") " pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.392768 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.403529 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.407254 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7qcg\" (UniqueName: \"kubernetes.io/projected/e0052a69-b4c2-41a7-9acc-3c9a936c20b9-kube-api-access-b7qcg\") pod \"node-ca-4trnx\" (UID: \"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\") " pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.407309 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqvhr\" (UniqueName: \"kubernetes.io/projected/1bf54982-69e3-4afd-95d9-654f58871b60-kube-api-access-zqvhr\") pod \"node-resolver-s7czm\" (UID: \"1bf54982-69e3-4afd-95d9-654f58871b60\") " pod="openshift-dns/node-resolver-s7czm" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.407667 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.407773 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.418827 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.433395 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.437680 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.453915 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-49474ad7c0c01e003c12a63b1739861aac4bfde8601cd92c1bcf6dfadbd434a1 WatchSource:0}: Error finding container 49474ad7c0c01e003c12a63b1739861aac4bfde8601cd92c1bcf6dfadbd434a1: Status 404 returned error can't find the container with id 49474ad7c0c01e003c12a63b1739861aac4bfde8601cd92c1bcf6dfadbd434a1 Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.459993 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.553020 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-s7czm" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.561632 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4trnx" Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.571863 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bf54982_69e3_4afd_95d9_654f58871b60.slice/crio-b1781f356d1a05b66a725508a10fa5fe6d2edcabe94f2470caf5ff5c63879bf1 WatchSource:0}: Error finding container b1781f356d1a05b66a725508a10fa5fe6d2edcabe94f2470caf5ff5c63879bf1: Status 404 returned error can't find the container with id b1781f356d1a05b66a725508a10fa5fe6d2edcabe94f2470caf5ff5c63879bf1 Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.693064 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.693109 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.693245 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.693302 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:06.693285138 +0000 UTC m=+21.388019520 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.693612 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.693665 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:06.693651569 +0000 UTC m=+21.388385951 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.789740 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.794316 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.794399 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.794440 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:03:06.794412551 +0000 UTC m=+21.489146933 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.794480 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.794529 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.794545 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.794557 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.794608 4753 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:06.794592706 +0000 UTC m=+21.489327088 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.794630 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.794666 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.794678 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:05 crc kubenswrapper[4753]: E0129 14:03:05.794753 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:06.79474213 +0000 UTC m=+21.489476512 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.797233 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.799392 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.812613 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.824451 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.837350 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.853609 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.855581 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.865824 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.877203 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.888594 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.903620 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc 
kubenswrapper[4753]: I0129 14:03:05.913950 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.925852 4753 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.926133 4753 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.926164 4753 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.926198 4753 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a 
second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.926165 4753 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.926747 4753 reflector.go:484] object-"openshift-image-registry"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.926761 4753 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.926806 4753 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.926848 4753 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.927132 4753 reflector.go:484] object-"openshift-image-registry"/"image-registry-certificates": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"image-registry-certificates": Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.927193 4753 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.927246 4753 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.927355 4753 reflector.go:484] object-"openshift-image-registry"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.927369 4753 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received
Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.927393 4753 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short
watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.927419 4753 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.927589 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-85b44fc459-gdk6g/status\": read tcp 38.102.83.142:46614->38.102.83.142:6443: use of closed network connection" Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.926186 4753 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.927947 4753 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.928060 4753 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 29 14:03:05 crc kubenswrapper[4753]: W0129 14:03:05.928179 4753 reflector.go:484] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": watch of *v1.Secret ended with: very short watch: object-"openshift-image-registry"/"node-ca-dockercfg-4777p": Unexpected watch close - watch lasted less than a second and no items received Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.941424 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.951898 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.963662 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.974863 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.978916 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-x6rpz"] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.979471 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.980702 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-vfrvp"] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.981211 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.981489 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.981553 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-g2rf5"] Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.981908 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.982090 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.982324 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.982459 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.984014 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.984742 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.984839 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.984894 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.984930 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.985136 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.985220 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.985131 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.995059 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996018 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf2500dd-260b-477b-905b-6f1c52455890-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996071 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qljt\" (UniqueName: \"kubernetes.io/projected/cf2500dd-260b-477b-905b-6f1c52455890-kube-api-access-9qljt\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996103 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-system-cni-dir\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996131 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/63926a91-5e42-4768-8277-55a0113cb5e2-cni-binary-copy\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996176 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-var-lib-cni-multus\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996211 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-os-release\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996240 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf2500dd-260b-477b-905b-6f1c52455890-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996271 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-hostroot\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996297 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/49d14260-5f77-47b9-97e1-c843cf322a0f-mcd-auth-proxy-config\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996330 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/63926a91-5e42-4768-8277-55a0113cb5e2-multus-daemon-config\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996355 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-etc-kubernetes\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996397 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np2fm\" (UniqueName: \"kubernetes.io/projected/63926a91-5e42-4768-8277-55a0113cb5e2-kube-api-access-np2fm\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996424 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-multus-socket-dir-parent\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996453 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-os-release\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996497 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-cnibin\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996522 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996550 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49d14260-5f77-47b9-97e1-c843cf322a0f-proxy-tls\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996574 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-run-multus-certs\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996637 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-var-lib-kubelet\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996699 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/49d14260-5f77-47b9-97e1-c843cf322a0f-rootfs\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996732 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-multus-conf-dir\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996763 
4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-run-k8s-cni-cncf-io\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996793 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdjpc\" (UniqueName: \"kubernetes.io/projected/49d14260-5f77-47b9-97e1-c843cf322a0f-kube-api-access-sdjpc\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996822 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-system-cni-dir\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996846 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-multus-cni-dir\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996873 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-run-netns\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996897 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-var-lib-cni-bin\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp" Jan 29 14:03:05 crc kubenswrapper[4753]: I0129 14:03:05.996927 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-cnibin\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.004837 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.017053 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.027195 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.036106 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.047964 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.058417 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.069105 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.080863 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.089541 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-29 13:58:05 +0000 UTC, rotation deadline is 2026-11-22 23:15:17.091861369 +0000 UTC
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.089598 4753 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7137h12m11.002265685s for next certificate rotation
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.092072 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.096963 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 23:37:10.627578134 +0000 UTC
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097431 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-system-cni-dir\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097462 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-var-lib-cni-multus\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097482 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-os-release\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097499 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf2500dd-260b-477b-905b-6f1c52455890-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097516 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63926a91-5e42-4768-8277-55a0113cb5e2-cni-binary-copy\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097537 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-hostroot\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097557 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/63926a91-5e42-4768-8277-55a0113cb5e2-multus-daemon-config\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097563 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-system-cni-dir\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097609 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-etc-kubernetes\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097573 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-etc-kubernetes\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097648 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-hostroot\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097653 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np2fm\" (UniqueName: \"kubernetes.io/projected/63926a91-5e42-4768-8277-55a0113cb5e2-kube-api-access-np2fm\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097673 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/49d14260-5f77-47b9-97e1-c843cf322a0f-mcd-auth-proxy-config\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097678 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-var-lib-cni-multus\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097693 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-os-release\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097773 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-multus-socket-dir-parent\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097812 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-cnibin\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097839 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097907 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49d14260-5f77-47b9-97e1-c843cf322a0f-proxy-tls\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097933 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-os-release\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097938 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-run-multus-certs\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097926 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-os-release\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097975 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-var-lib-kubelet\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.097999 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/49d14260-5f77-47b9-97e1-c843cf322a0f-rootfs\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098028 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-run-k8s-cni-cncf-io\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098051 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-multus-conf-dir\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098078 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-system-cni-dir\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098105 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdjpc\" (UniqueName: \"kubernetes.io/projected/49d14260-5f77-47b9-97e1-c843cf322a0f-kube-api-access-sdjpc\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098129 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-run-netns\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098171 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-var-lib-cni-bin\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098193 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-cnibin\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098657 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-multus-cni-dir\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098690 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qljt\" (UniqueName: \"kubernetes.io/projected/cf2500dd-260b-477b-905b-6f1c52455890-kube-api-access-9qljt\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098719 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf2500dd-260b-477b-905b-6f1c52455890-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098980 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63926a91-5e42-4768-8277-55a0113cb5e2-cni-binary-copy\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.098995 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/49d14260-5f77-47b9-97e1-c843cf322a0f-mcd-auth-proxy-config\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099380 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/63926a91-5e42-4768-8277-55a0113cb5e2-multus-daemon-config\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099444 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-multus-conf-dir\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099536 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-multus-socket-dir-parent\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099589 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-cnibin\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099606 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cf2500dd-260b-477b-905b-6f1c52455890-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099624 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cf2500dd-260b-477b-905b-6f1c52455890-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099732 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-system-cni-dir\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099823 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-cnibin\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099941 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-multus-cni-dir\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099948 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-run-netns\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.099986 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-var-lib-cni-bin\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.100095 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-var-lib-kubelet\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.100176 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cf2500dd-260b-477b-905b-6f1c52455890-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2rf5\" (UID: \"cf2500dd-260b-477b-905b-6f1c52455890\") " pod="openshift-multus/multus-additional-cni-plugins-g2rf5"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.100242 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-run-multus-certs\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.100284 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/49d14260-5f77-47b9-97e1-c843cf322a0f-rootfs\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.100323 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/63926a91-5e42-4768-8277-55a0113cb5e2-host-run-k8s-cni-cncf-io\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.105403 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/49d14260-5f77-47b9-97e1-c843cf322a0f-proxy-tls\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.107097 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.115086 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np2fm\" (UniqueName: \"kubernetes.io/projected/63926a91-5e42-4768-8277-55a0113cb5e2-kube-api-access-np2fm\") pod \"multus-vfrvp\" (UID: \"63926a91-5e42-4768-8277-55a0113cb5e2\") " pod="openshift-multus/multus-vfrvp"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.117615 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.122599 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdjpc\" (UniqueName: \"kubernetes.io/projected/49d14260-5f77-47b9-97e1-c843cf322a0f-kube-api-access-sdjpc\") pod \"machine-config-daemon-x6rpz\" (UID: \"49d14260-5f77-47b9-97e1-c843cf322a0f\") " pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.131469 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"
host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.145023 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.152730 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.153490 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.154616 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.155222 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.156160 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.156642 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.156763 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.157251 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.158218 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.158808 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.159706 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.160219 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.161208 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.161727 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.162249 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.163113 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 29 14:03:06 crc kubenswrapper[4753]: 
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.163620 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.164548 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.164659 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.164917 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.165477 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.166546 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.166997 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.168013 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.168449 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.169441 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.169829 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.170574 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.171582 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.172024 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.172924 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.173408 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.174223 4753 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.174319 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.175872 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.176756 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.177273 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.177268 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.178713 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.179337 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.180188 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.180768 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.181796 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.182258 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.183200 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.183812 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.184740 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.185237 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.186069 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.186729 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.187785 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.188284 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.189226 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.189710 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.190647 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.194919 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.195545 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.197502 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.231899 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.275601 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.277652 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792" exitCode=255 Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.277729 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.278498 4753 scope.go:117] "RemoveContainer" containerID="e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.279631 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s7czm" event={"ID":"1bf54982-69e3-4afd-95d9-654f58871b60","Type":"ContainerStarted","Data":"c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.279668 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s7czm" event={"ID":"1bf54982-69e3-4afd-95d9-654f58871b60","Type":"ContainerStarted","Data":"b1781f356d1a05b66a725508a10fa5fe6d2edcabe94f2470caf5ff5c63879bf1"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.282996 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.283025 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b0f9d22c52eaf2ada78fab3084dd382c858a6778e88baac80ab8b15428348a40"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.284803 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4trnx" 
event={"ID":"e0052a69-b4c2-41a7-9acc-3c9a936c20b9","Type":"ContainerStarted","Data":"2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.284829 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4trnx" event={"ID":"e0052a69-b4c2-41a7-9acc-3c9a936c20b9","Type":"ContainerStarted","Data":"ba69dd9c7f518b1e84c3db170cbc4467ab267093f474c664ac546af006b96596"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.286695 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"82625ed06a30936da8c3b8ae147effbe495780066ef722207000fbaf7ac81330"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.289027 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.289066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.291224 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.292309 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.297191 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"49474ad7c0c01e003c12a63b1739861aac4bfde8601cd92c1bcf6dfadbd434a1"} Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.301449 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vfrvp" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.308551 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" Jan 29 14:03:06 crc kubenswrapper[4753]: W0129 14:03:06.325755 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63926a91_5e42_4768_8277_55a0113cb5e2.slice/crio-cfb2e6bcbd5c2cf91e491123389296f52a21443c02ef8c7198d6686f10c41b93 WatchSource:0}: Error finding container cfb2e6bcbd5c2cf91e491123389296f52a21443c02ef8c7198d6686f10c41b93: Status 404 returned error can't find the container with id cfb2e6bcbd5c2cf91e491123389296f52a21443c02ef8c7198d6686f10c41b93 Jan 29 14:03:06 crc kubenswrapper[4753]: W0129 14:03:06.327407 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49d14260_5f77_47b9_97e1_c843cf322a0f.slice/crio-781d214c009d16c73ead60d3908470561b227ea4af27ff47110133f986fd429c WatchSource:0}: Error finding container 781d214c009d16c73ead60d3908470561b227ea4af27ff47110133f986fd429c: Status 404 returned error can't find the container with id 781d214c009d16c73ead60d3908470561b227ea4af27ff47110133f986fd429c Jan 29 14:03:06 crc kubenswrapper[4753]: W0129 14:03:06.330288 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf2500dd_260b_477b_905b_6f1c52455890.slice/crio-b34a8bce7b916230ee7030950927722cd81126b863b98c8e0d8775447d9c7f2a WatchSource:0}: Error finding container b34a8bce7b916230ee7030950927722cd81126b863b98c8e0d8775447d9c7f2a: Status 404 returned error can't find the container with id b34a8bce7b916230ee7030950927722cd81126b863b98c8e0d8775447d9c7f2a Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.333367 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.356211 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-che
ck-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.356636 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9pd9r"] Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.359011 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.362591 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.362790 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.362901 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.363033 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.363051 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.363831 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.365134 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.377395 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.391136 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402214 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-netd\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402258 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a796c89a-761f-48d7-80b5-031f75703f32-ovn-node-metrics-cert\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402285 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402315 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-systemd-units\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402338 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-kubelet\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402357 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-var-lib-openvswitch\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402377 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-node-log\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402398 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-script-lib\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402429 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-log-socket\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402450 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-ovn-kubernetes\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402468 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-env-overrides\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402502 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-netns\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402521 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-ovn\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402544 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-config\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402562 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-etc-openvswitch\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402582 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-bin\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402604 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-slash\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402625 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-systemd\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402644 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-openvswitch\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.402668 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl6hh\" (UniqueName: \"kubernetes.io/projected/a796c89a-761f-48d7-80b5-031f75703f32-kube-api-access-xl6hh\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 
14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.403216 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.414334 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.444277 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.484599 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503273 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl6hh\" (UniqueName: \"kubernetes.io/projected/a796c89a-761f-48d7-80b5-031f75703f32-kube-api-access-xl6hh\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503362 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-netd\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503386 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a796c89a-761f-48d7-80b5-031f75703f32-ovn-node-metrics-cert\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc 
kubenswrapper[4753]: I0129 14:03:06.503406 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503428 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-systemd-units\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503456 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-kubelet\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503475 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-var-lib-openvswitch\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503493 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-node-log\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503509 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-script-lib\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503509 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503528 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-log-socket\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503544 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-ovn-kubernetes\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 
14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503538 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-netd\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503592 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-systemd-units\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503562 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-env-overrides\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503647 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-var-lib-openvswitch\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503687 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-kubelet\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503717 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-log-socket\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503742 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-ovn-kubernetes\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503707 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-node-log\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503794 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-netns\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503769 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-netns\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503955 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-ovn\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.503980 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-ovn\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504111 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-env-overrides\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504122 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-bin\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504204 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-bin\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504233 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-config\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504242 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-script-lib\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504330 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-etc-openvswitch\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504506 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-etc-openvswitch\") pod \"ovnkube-node-9pd9r\" (UID: 
\"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504665 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-config\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504740 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-slash\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504742 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-slash\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504794 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-systemd\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504816 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-openvswitch\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504825 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-systemd\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.504905 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-openvswitch\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.510822 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a796c89a-761f-48d7-80b5-031f75703f32-ovn-node-metrics-cert\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.521932 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.551323 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl6hh\" (UniqueName: \"kubernetes.io/projected/a796c89a-761f-48d7-80b5-031f75703f32-kube-api-access-xl6hh\") pod \"ovnkube-node-9pd9r\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.583831 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.627178 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-che
ck-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.662045 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.674056 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:06 crc kubenswrapper[4753]: W0129 14:03:06.686075 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda796c89a_761f_48d7_80b5_031f75703f32.slice/crio-5404f3066943ea510e2486041cf0532862eeb302ba50806da7c2eed384dcdc8a WatchSource:0}: Error finding container 5404f3066943ea510e2486041cf0532862eeb302ba50806da7c2eed384dcdc8a: Status 404 returned error can't find the container with id 5404f3066943ea510e2486041cf0532862eeb302ba50806da7c2eed384dcdc8a Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.700507 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.706978 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.707022 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.707209 4753 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.707269 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:08.70725029 +0000 UTC m=+23.401984682 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.707644 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.707687 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:08.707676812 +0000 UTC m=+23.402411204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.741515 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.781792 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.793399 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.808462 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.808637 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.808744 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:03:08.808711681 +0000 UTC m=+23.503446073 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.808808 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.808828 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.808842 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.808917 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:08.808899417 +0000 UTC m=+23.503633979 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.809346 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.809461 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.809481 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.809490 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:06 crc kubenswrapper[4753]: E0129 14:03:06.809534 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:08.809524905 +0000 UTC m=+23.504259477 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.812502 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.864739 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.872743 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.912931 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.949324 4753 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:06 crc kubenswrapper[4753]: I0129 14:03:06.973686 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.004794 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.047436 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.053299 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.073990 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.098125 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 03:21:48.656394901 +0000 UTC Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.114255 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.133076 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.148495 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.148564 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.148631 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:07 crc kubenswrapper[4753]: E0129 14:03:07.148662 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:07 crc kubenswrapper[4753]: E0129 14:03:07.148741 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:07 crc kubenswrapper[4753]: E0129 14:03:07.148851 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.166545 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.173106 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.213247 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.241745 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.273406 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.301596 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.303189 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.303432 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.304955 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f" exitCode=0 Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.305022 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.305049 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"5404f3066943ea510e2486041cf0532862eeb302ba50806da7c2eed384dcdc8a"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.305685 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.307060 4753 generic.go:334] "Generic (PLEG): container finished" podID="cf2500dd-260b-477b-905b-6f1c52455890" containerID="43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e" exitCode=0 Jan 29 14:03:07 crc kubenswrapper[4753]: 
I0129 14:03:07.307119 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" event={"ID":"cf2500dd-260b-477b-905b-6f1c52455890","Type":"ContainerDied","Data":"43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.307137 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" event={"ID":"cf2500dd-260b-477b-905b-6f1c52455890","Type":"ContainerStarted","Data":"b34a8bce7b916230ee7030950927722cd81126b863b98c8e0d8775447d9c7f2a"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.310313 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vfrvp" event={"ID":"63926a91-5e42-4768-8277-55a0113cb5e2","Type":"ContainerStarted","Data":"450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.310343 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vfrvp" event={"ID":"63926a91-5e42-4768-8277-55a0113cb5e2","Type":"ContainerStarted","Data":"cfb2e6bcbd5c2cf91e491123389296f52a21443c02ef8c7198d6686f10c41b93"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.313952 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.313998 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.314018 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"781d214c009d16c73ead60d3908470561b227ea4af27ff47110133f986fd429c"} Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.314044 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.333431 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.354731 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.394319 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.425227 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.469339 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.506227 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.546599 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.584663 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.629172 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.640498 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mount
Path\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.645497 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.646020 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.697306 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z 
is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.737447 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.773455 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.819402 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.843714 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.883806 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.920991 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:07 crc kubenswrapper[4753]: I0129 14:03:07.973261 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.001880 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:07Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.045068 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.084172 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.098451 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 08:20:14.939260309 +0000 UTC Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.123041 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.162186 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.207279 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-c
ni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.249111 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z 
is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.286828 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.317368 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e"} Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.319952 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.319988 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" 
event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.320000 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.320015 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.321859 4753 generic.go:334] "Generic (PLEG): container finished" podID="cf2500dd-260b-477b-905b-6f1c52455890" containerID="36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55" exitCode=0 Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.321955 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" event={"ID":"cf2500dd-260b-477b-905b-6f1c52455890","Type":"ContainerDied","Data":"36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55"} Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.328902 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.364021 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.410053 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.443316 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.486052 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.524602 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.561400 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.601888 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.644912 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.683809 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.726458 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.729439 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.729686 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.729595 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.729978 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-29 14:03:12.729956733 +0000 UTC m=+27.424691125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.729807 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.730138 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:12.730104878 +0000 UTC m=+27.424839260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.764775 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.803909 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.830891 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.831066 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.831104 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.831319 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.831341 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.831337 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.831378 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.831392 4753 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.831358 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.831475 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:12.831444296 +0000 UTC m=+27.526178678 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.831498 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:12.831490397 +0000 UTC m=+27.526224779 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:08 crc kubenswrapper[4753]: E0129 14:03:08.831546 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:03:12.831538219 +0000 UTC m=+27.526272601 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.848302 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.891629 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z 
is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.928079 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:08 crc kubenswrapper[4753]: I0129 14:03:08.962951 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:08Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.021028 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.054007 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.083563 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.098797 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 23:28:14.488802419 +0000 UTC
Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.125213 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-
29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.149141 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.149211 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.149357 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 14:03:09 crc kubenswrapper[4753]: E0129 14:03:09.149594 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 14:03:09 crc kubenswrapper[4753]: E0129 14:03:09.149658 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 14:03:09 crc kubenswrapper[4753]: E0129 14:03:09.149511 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.176816 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c5
5e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.202709 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.241920 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.332199 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.332278 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.335068 4753 generic.go:334] "Generic (PLEG): container finished" podID="cf2500dd-260b-477b-905b-6f1c52455890" containerID="145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb" exitCode=0 Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.335126 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" event={"ID":"cf2500dd-260b-477b-905b-6f1c52455890","Type":"ContainerDied","Data":"145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb"} Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.351060 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.373874 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.391357 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.409887 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.453474 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.486769 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.528983 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.563026 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.601955 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.643587 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.683378 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.733231 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.762535 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.803667 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.851306 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.889293 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.928486 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:09 crc kubenswrapper[4753]: I0129 14:03:09.963816 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:09Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.013368 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.046984 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.082037 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.099523 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 14:35:00.400174989 +0000 UTC Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.131815 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.162758 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.200808 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.242447 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.288629 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.327933 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.343800 4753 generic.go:334] "Generic (PLEG): container finished" podID="cf2500dd-260b-477b-905b-6f1c52455890" containerID="21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e" exitCode=0 Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.343877 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" event={"ID":"cf2500dd-260b-477b-905b-6f1c52455890","Type":"ContainerDied","Data":"21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e"} Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.366181 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.412609 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.454870 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z 
is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.497704 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.522238 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.563557 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.604196 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.641903 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.683592 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.722902 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.769988 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z 
is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.803473 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.843390 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.881484 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.924908 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:10 crc kubenswrapper[4753]: I0129 14:03:10.973459 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:10Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.008717 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.017861 4753 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.064240 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.099946 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 10:14:04.418278418 +0000 UTC Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.149337 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.149369 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.149361 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:11 crc kubenswrapper[4753]: E0129 14:03:11.149525 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:11 crc kubenswrapper[4753]: E0129 14:03:11.149673 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:11 crc kubenswrapper[4753]: E0129 14:03:11.149741 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.352014 4753 generic.go:334] "Generic (PLEG): container finished" podID="cf2500dd-260b-477b-905b-6f1c52455890" containerID="204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58" exitCode=0 Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.352104 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" event={"ID":"cf2500dd-260b-477b-905b-6f1c52455890","Type":"ContainerDied","Data":"204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.358997 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.378184 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.401867 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.417505 4753 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.420306 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.420366 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.420381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.420583 4753 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.424367 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.433811 4753 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.434357 4753 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.436350 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc 
kubenswrapper[4753]: I0129 14:03:11.436408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.436435 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.436488 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.436507 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.453564 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z 
is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: E0129 14:03:11.459049 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.464563 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.464636 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.464661 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.464690 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.464711 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.472795 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: E0129 14:03:11.485523 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image list elided; identical to the first status patch above ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.492356 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.492689 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.492743 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.492756 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.492779 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.492795 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:11 crc kubenswrapper[4753]: E0129 14:03:11.509594 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image list elided; identical to the first status patch above ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.511692 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.516976 4753 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.517041 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.517053 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.517075 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.517092 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.529999 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: E0129 14:03:11.534973 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... image list elided; identical to the first status patch above ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.539756 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.539822 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.539837 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.539864 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.539882 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.551726 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: E0129 14:03:11.556020 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: E0129 14:03:11.556226 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.558222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.558270 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.558287 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.558314 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.558332 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.570853 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.584878 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.608116 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.622729 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.662246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.662292 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.662306 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.662344 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.662360 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.662955 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.706507 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:11Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.766274 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.766341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.766359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.766388 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.766412 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.871438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.871480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.871490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.871555 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.871567 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.975119 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.975216 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.975237 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.975270 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:11 crc kubenswrapper[4753]: I0129 14:03:11.975291 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:11Z","lastTransitionTime":"2026-01-29T14:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.078196 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.078254 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.078272 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.078297 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.078315 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:12Z","lastTransitionTime":"2026-01-29T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.100919 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 19:18:46.754242509 +0000 UTC Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.182383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.182461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.182481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.182513 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.182536 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:12Z","lastTransitionTime":"2026-01-29T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.286581 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.286679 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.286702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.286731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.286754 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:12Z","lastTransitionTime":"2026-01-29T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.370567 4753 generic.go:334] "Generic (PLEG): container finished" podID="cf2500dd-260b-477b-905b-6f1c52455890" containerID="6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3" exitCode=0 Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.370643 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" event={"ID":"cf2500dd-260b-477b-905b-6f1c52455890","Type":"ContainerDied","Data":"6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.389446 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.389493 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.389511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.389534 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.389553 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:12Z","lastTransitionTime":"2026-01-29T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.394804 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.415963 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.436906 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.459094 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.488052 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroo
t\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.493202 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.493277 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.493296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.493326 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.493347 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:12Z","lastTransitionTime":"2026-01-29T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.526485 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce64
0fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.555673 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.577417 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.598499 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.602918 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.602947 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.602958 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.602974 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.602987 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:12Z","lastTransitionTime":"2026-01-29T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.618654 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.639648 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.659429 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.686075 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.699958 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.705237 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.705290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.705305 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.705326 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.705339 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:12Z","lastTransitionTime":"2026-01-29T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.713103 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:12Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.773742 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.773805 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.773914 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.774018 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:20.773993412 +0000 UTC m=+35.468727794 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.774084 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.774210 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:20.774186097 +0000 UTC m=+35.468920479 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.809111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.809195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.809208 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.809230 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.809243 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:12Z","lastTransitionTime":"2026-01-29T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.874819 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.875123 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:03:20.875083923 +0000 UTC m=+35.569818305 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.875371 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.875446 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.875704 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.875728 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.875741 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.875788 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:20.875780772 +0000 UTC m=+35.570515154 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.875853 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.875935 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.875999 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:12 crc kubenswrapper[4753]: E0129 14:03:12.876134 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:20.876103972 +0000 UTC m=+35.570838514 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.913044 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.913691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.913712 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.913740 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:12 crc kubenswrapper[4753]: I0129 14:03:12.913758 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:12Z","lastTransitionTime":"2026-01-29T14:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.016725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.016801 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.016820 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.016849 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.016868 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.101070 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 15:49:43.41126153 +0000 UTC Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.119713 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.119783 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.119801 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.119828 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.119847 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.149041 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:13 crc kubenswrapper[4753]: E0129 14:03:13.149235 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.149448 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.149548 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:13 crc kubenswrapper[4753]: E0129 14:03:13.149665 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:13 crc kubenswrapper[4753]: E0129 14:03:13.149757 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.223438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.223490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.223503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.223528 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.223547 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.327108 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.327165 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.327183 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.327207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.327220 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.379216 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.379590 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.383958 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" event={"ID":"cf2500dd-260b-477b-905b-6f1c52455890","Type":"ContainerStarted","Data":"1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.401193 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.419712 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.421162 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.430880 4753 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.430948 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.430976 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.431013 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.431038 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.434558 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe
1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.449635 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns
\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.479676 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e
1819ec0455eb080acde6f55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.498548 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.517985 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.531858 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.533901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.534049 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.534186 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.534276 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.534353 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.554684 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.573212 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.589760 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.605997 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
1-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.632450 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.637752 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.637800 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.637820 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.637841 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.637857 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.653365 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.680743 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.682580 4753 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.695915 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.709418 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.727272 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.740599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.740632 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.740650 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.740672 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.740690 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.743351 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.758863 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.782122 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.802034 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.817058 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.836372 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.844328 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.844377 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.844391 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.844412 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.844425 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.853828 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.867845 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.883364 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.897637 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.919954 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/m
etrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},
{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.932708 4753 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:13Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.947058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.947093 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.947103 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.947121 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:13 crc kubenswrapper[4753]: I0129 14:03:13.947135 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:13Z","lastTransitionTime":"2026-01-29T14:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.050842 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.050907 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.050924 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.050949 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.050968 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:14Z","lastTransitionTime":"2026-01-29T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.101873 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 12:25:49.400891494 +0000 UTC Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.154312 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.154384 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.154407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.154437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.154461 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:14Z","lastTransitionTime":"2026-01-29T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.258515 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.258613 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.258637 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.258672 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.258696 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:14Z","lastTransitionTime":"2026-01-29T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.362326 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.362395 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.362408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.362433 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.362450 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:14Z","lastTransitionTime":"2026-01-29T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.388534 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.389253 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.423299 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.440474 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.453501 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.466382 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.466431 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.466443 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.466468 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.466483 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:14Z","lastTransitionTime":"2026-01-29T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.467187 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.491603 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\
\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.526319 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e
1819ec0455eb080acde6f55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.557002 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.569204 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.569261 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.569275 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.569298 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.569313 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:14Z","lastTransitionTime":"2026-01-29T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.578977 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.593254 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.610827 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.623004 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.637354 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":
\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.665461 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.672451 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.672506 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.672525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.672552 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.672569 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:14Z","lastTransitionTime":"2026-01-29T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.687635 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.707077 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.722251 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:14Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.811110 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.811244 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.811265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.811326 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.811347 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:14Z","lastTransitionTime":"2026-01-29T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.914836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.914928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.914952 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.914987 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:14 crc kubenswrapper[4753]: I0129 14:03:14.915010 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:14Z","lastTransitionTime":"2026-01-29T14:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.018928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.018993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.019003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.019020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.019032 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.102612 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 22:03:47.938244777 +0000 UTC Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.122318 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.122378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.122394 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.122419 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.122440 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.148836 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.148877 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.148898 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:15 crc kubenswrapper[4753]: E0129 14:03:15.149050 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:15 crc kubenswrapper[4753]: E0129 14:03:15.149237 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:15 crc kubenswrapper[4753]: E0129 14:03:15.149389 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.225993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.226059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.226073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.226103 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.226117 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.329305 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.329370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.329394 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.329425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.329450 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.392984 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.433572 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.433674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.433693 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.433751 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.433770 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.537954 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.538008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.538021 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.538045 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.538062 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.622466 4753 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.640191 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.640240 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.640253 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.640275 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.640288 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.742390 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.742435 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.742450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.742468 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.742483 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.844547 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.844586 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.844596 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.844609 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.844621 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.946845 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.946891 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.946901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.946918 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:15 crc kubenswrapper[4753]: I0129 14:03:15.946930 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:15Z","lastTransitionTime":"2026-01-29T14:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.050379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.050448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.050470 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.050492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.050506 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.103284 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 17:16:17.811869627 +0000 UTC Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.153463 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.153519 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.153567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.153590 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.153605 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.187063 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b6
95ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.203999 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.216884 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.236883 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.255488 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.255556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc 
kubenswrapper[4753]: I0129 14:03:16.255570 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.255587 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.255599 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.256511 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.269436 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.287944 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.311018 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.324186 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.338028 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.351188 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.358466 4753 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.358539 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.358566 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.358599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.358626 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.362609 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.375592 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.387331 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.395321 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.397992 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:16Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.461826 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.461863 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.461876 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.461893 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.461904 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.565751 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.565819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.565839 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.565867 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.565888 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.670022 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.670085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.670109 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.670141 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.670198 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.773257 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.773341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.773367 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.773404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.773461 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.876632 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.876697 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.876721 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.876751 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.876776 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.979810 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.979874 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.979893 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.979917 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:16 crc kubenswrapper[4753]: I0129 14:03:16.979936 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:16Z","lastTransitionTime":"2026-01-29T14:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.082889 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.082950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.082975 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.083006 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.083030 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:17Z","lastTransitionTime":"2026-01-29T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.104800 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 04:44:47.789023117 +0000 UTC Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.148401 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.148459 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.148400 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:17 crc kubenswrapper[4753]: E0129 14:03:17.148624 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:17 crc kubenswrapper[4753]: E0129 14:03:17.148799 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:17 crc kubenswrapper[4753]: E0129 14:03:17.148927 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.186134 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.186223 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.186246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.186273 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.186294 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:17Z","lastTransitionTime":"2026-01-29T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.289941 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.290007 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.290026 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.290054 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.290074 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:17Z","lastTransitionTime":"2026-01-29T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.392707 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.392776 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.392794 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.392829 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.392848 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:17Z","lastTransitionTime":"2026-01-29T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.404270 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/0.log" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.408931 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e" exitCode=1 Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.409009 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.410505 4753 scope.go:117] "RemoveContainer" containerID="8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.434585 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.458333 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.481468 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.501769 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.501852 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.501877 4753 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.501909 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.501936 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:17Z","lastTransitionTime":"2026-01-29T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.518095 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e
1819ec0455eb080acde6f55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:16Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.342963 6057 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.342993 6057 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.343020 6057 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0129 14:03:16.343048 6057 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.343072 6057 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.344721 6057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 14:03:16.344741 6057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 14:03:16.344776 6057 factory.go:656] Stopping watch factory\\\\nI0129 14:03:16.344802 6057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 14:03:16.344815 6057 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.538271 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.554690 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.573108 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.596593 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.604732 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.604775 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.604788 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.604806 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.604820 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:17Z","lastTransitionTime":"2026-01-29T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.622430 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.648798 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.651372 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.665538 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.692558 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.708631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.708703 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.708718 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.708742 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.708754 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:17Z","lastTransitionTime":"2026-01-29T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.710626 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.728219 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.746413 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.763661 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.788407 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.807401 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.811868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.811926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.811945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.811971 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.811989 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:17Z","lastTransitionTime":"2026-01-29T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.825875 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.852931 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:16Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.342963 6057 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.342993 6057 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.343020 6057 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0129 14:03:16.343048 6057 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.343072 6057 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.344721 6057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 14:03:16.344741 6057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 
14:03:16.344776 6057 factory.go:656] Stopping watch factory\\\\nI0129 14:03:16.344802 6057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 14:03:16.344815 6057 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initCo
ntainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.874551 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.895414 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.914521 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.914614 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.914641 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.914670 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.914692 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:17Z","lastTransitionTime":"2026-01-29T14:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.915627 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.942696 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\
"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.972584 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:17 crc kubenswrapper[4753]: I0129 14:03:17.991729 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:17Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.018548 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.018620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.018636 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.018663 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.018679 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.021755 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.039974 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.057639 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.079721 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 
2025-08-24T17:21:41Z"
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.104997 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 01:01:18.443746787 +0000 UTC
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.122088 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.122181 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.122256 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.122280 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.122293 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.177036 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72"]
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.177803 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72"
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.180498 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.180890 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.197539 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.220984 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.225420 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.225503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.225519 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.225550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.225566 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.247850 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.255874 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08d5c2a2-f3dc-4cda-991d-bfa44033357e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.255944 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08d5c2a2-f3dc-4cda-991d-bfa44033357e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.256204 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08d5c2a2-f3dc-4cda-991d-bfa44033357e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.256258 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4drqk\" (UniqueName: 
\"kubernetes.io/projected/08d5c2a2-f3dc-4cda-991d-bfa44033357e-kube-api-access-4drqk\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.298921 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\
\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.324412 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.328438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.328526 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.328550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.328579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.328597 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.351057 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.357915 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08d5c2a2-f3dc-4cda-991d-bfa44033357e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.358004 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08d5c2a2-f3dc-4cda-991d-bfa44033357e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.358052 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08d5c2a2-f3dc-4cda-991d-bfa44033357e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.358078 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4drqk\" (UniqueName: \"kubernetes.io/projected/08d5c2a2-f3dc-4cda-991d-bfa44033357e-kube-api-access-4drqk\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.358635 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08d5c2a2-f3dc-4cda-991d-bfa44033357e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.359376 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08d5c2a2-f3dc-4cda-991d-bfa44033357e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.365761 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08d5c2a2-f3dc-4cda-991d-bfa44033357e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.372974 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.379506 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4drqk\" (UniqueName: \"kubernetes.io/projected/08d5c2a2-f3dc-4cda-991d-bfa44033357e-kube-api-access-4drqk\") pod \"ovnkube-control-plane-749d76644c-z8p72\" (UID: \"08d5c2a2-f3dc-4cda-991d-bfa44033357e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.401049 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.414928 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/0.log" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.419010 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139"} Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.419249 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.423553 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.431174 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.431226 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.431239 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.431255 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.431269 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.443415 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.461608 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.476253 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.498517 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.500564 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" Jan 29 14:03:18 crc kubenswrapper[4753]: W0129 14:03:18.516371 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08d5c2a2_f3dc_4cda_991d_bfa44033357e.slice/crio-1567777e61430f5127bf2bd54d4a00b4ab6eb24932fc1e4baec29301e397b427 WatchSource:0}: Error finding container 1567777e61430f5127bf2bd54d4a00b4ab6eb24932fc1e4baec29301e397b427: Status 404 returned error can't find the container with id 1567777e61430f5127bf2bd54d4a00b4ab6eb24932fc1e4baec29301e397b427 Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.534105 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.534172 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.534187 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.534207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.534218 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.534104 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:16Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.342963 6057 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.342993 6057 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.343020 6057 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0129 14:03:16.343048 6057 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.343072 6057 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.344721 6057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 14:03:16.344741 6057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 14:03:16.344776 6057 factory.go:656] Stopping watch factory\\\\nI0129 14:03:16.344802 6057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 14:03:16.344815 6057 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.553247 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.577132 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.588701 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.602997 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.616563 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.637603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.637639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.637652 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.637671 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.637685 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.637715 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf733
1d1ab414695aefcf43e30139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:16Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.342963 6057 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.342993 6057 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.343020 6057 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0129 14:03:16.343048 6057 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.343072 6057 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.344721 6057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 14:03:16.344741 6057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 14:03:16.344776 6057 factory.go:656] Stopping watch factory\\\\nI0129 14:03:16.344802 6057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 14:03:16.344815 6057 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.652551 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.668353 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.680482 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.697127 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.714082 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.729840 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.740252 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.740455 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:18 crc 
kubenswrapper[4753]: I0129 14:03:18.740525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.740601 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.740678 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.743932 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.767944 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.783921 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.801903 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.820604 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.837428 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:18Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.843781 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.843811 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.843821 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.843835 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.843847 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.945992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.946307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.946398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.946498 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:18 crc kubenswrapper[4753]: I0129 14:03:18.946592 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:18Z","lastTransitionTime":"2026-01-29T14:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.049950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.050017 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.050029 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.050047 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.050064 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.105675 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 19:47:23.643726305 +0000 UTC Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.149263 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.149288 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.149406 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:19 crc kubenswrapper[4753]: E0129 14:03:19.149537 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:19 crc kubenswrapper[4753]: E0129 14:03:19.149862 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:19 crc kubenswrapper[4753]: E0129 14:03:19.149972 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.152247 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.152275 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.152286 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.152297 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.152308 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.255601 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.255928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.256012 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.256142 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.256294 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.360513 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.360598 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.360626 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.360660 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.360684 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.424542 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/1.log" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.425262 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/0.log" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.428962 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139" exitCode=2 Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.429008 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.429075 4753 scope.go:117] "RemoveContainer" containerID="8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.429989 4753 scope.go:117] "RemoveContainer" containerID="f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139" Jan 29 14:03:19 crc kubenswrapper[4753]: E0129 14:03:19.430253 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.432201 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" event={"ID":"08d5c2a2-f3dc-4cda-991d-bfa44033357e","Type":"ContainerStarted","Data":"ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.432259 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" event={"ID":"08d5c2a2-f3dc-4cda-991d-bfa44033357e","Type":"ContainerStarted","Data":"6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.432279 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" event={"ID":"08d5c2a2-f3dc-4cda-991d-bfa44033357e","Type":"ContainerStarted","Data":"1567777e61430f5127bf2bd54d4a00b4ab6eb24932fc1e4baec29301e397b427"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.447499 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.462571 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.462761 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.462790 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.462801 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.462819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.462831 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.476022 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.490441 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.521802 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf733
1d1ab414695aefcf43e30139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:16Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.342963 6057 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.342993 6057 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.343020 6057 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0129 14:03:16.343048 6057 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.343072 6057 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.344721 6057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 14:03:16.344741 6057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 14:03:16.344776 6057 factory.go:656] Stopping watch factory\\\\nI0129 14:03:16.344802 6057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 14:03:16.344815 6057 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"dbus.Signature{str:\\\\\\\"(uo)\\\\\\\"}, value:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\"}}, \\\\\\\"StateChangeTimestamp\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x64987519d763c}, \\\\\\\"StateChangeTimestampMonotonic\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x4d04347}, \\\\\\\"SubState\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"running\\\\\\\"}}, []string{\\\\\\\"Conditions\\\\\\\", \\\\\\\"Asserts\\\\\\\"}}, Sequence:0x20}\\\\nI0129 14:03:18.952571 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1\\\\\\\", Name:\\\\\\\"org.freedesktop.systemd1.Manager.JobRemoved\\\\\\\", Body:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\", \\\\\\\"crio-conmon-ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e.scope\\\\\\\", \\\\\\\"done\\\\\\\"}, Sequence:0x21}\\\\nI0129 14:03:18.952582 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", 
Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2dconmon_2dded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x22}\\\\nI0129 14:03:18.952593 6189 ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\
\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.540664 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.554614 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.565555 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.565618 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.565634 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.565658 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.565676 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.567671 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.583868 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.596052 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.612702 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.625599 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.651028 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.667726 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.668682 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.668716 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.668731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.668749 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.668763 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.685359 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.700915 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.714948 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.731494 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.759189 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cer
t\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:16Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.342963 6057 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.342993 6057 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.343020 6057 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0129 14:03:16.343048 6057 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.343072 6057 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.344721 6057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 14:03:16.344741 6057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 14:03:16.344776 6057 factory.go:656] Stopping watch factory\\\\nI0129 14:03:16.344802 6057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 14:03:16.344815 6057 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"dbus.Signature{str:\\\\\\\"(uo)\\\\\\\"}, value:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\"}}, \\\\\\\"StateChangeTimestamp\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x64987519d763c}, \\\\\\\"StateChangeTimestampMonotonic\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x4d04347}, \\\\\\\"SubState\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"running\\\\\\\"}}, []string{\\\\\\\"Conditions\\\\\\\", \\\\\\\"Asserts\\\\\\\"}}, Sequence:0x20}\\\\nI0129 14:03:18.952571 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1\\\\\\\", Name:\\\\\\\"org.freedesktop.systemd1.Manager.JobRemoved\\\\\\\", Body:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\", \\\\\\\"crio-conmon-ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e.scope\\\\\\\", \\\\\\\"done\\\\\\\"}, Sequence:0x21}\\\\nI0129 14:03:18.952582 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2dconmon_2dded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x22}\\\\nI0129 14:03:18.952593 6189 
ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.771850 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.771906 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.771917 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.771935 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.771966 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.774784 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.791239 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.801363 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.816424 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.831841 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.855693 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a
682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.873340 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.874694 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.874772 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.874788 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.874808 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.874823 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.902312 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.922984 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.935603 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.952764 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.972225 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.977890 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.977924 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.977934 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.977952 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.977966 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:19Z","lastTransitionTime":"2026-01-29T14:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:19 crc kubenswrapper[4753]: I0129 14:03:19.992309 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:19Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.080982 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 
14:03:20.081066 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.081092 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.081127 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.081201 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:20Z","lastTransitionTime":"2026-01-29T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.085694 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-57lv7"] Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.086699 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.086841 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.106560 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 02:48:56.729648602 +0000 UTC Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.109424 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.127500 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.142497 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.176762 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.180365 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74bbq\" (UniqueName: \"kubernetes.io/projected/281741ae-7781-4682-8b1d-207c9a437581-kube-api-access-74bbq\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.180461 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.185018 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.185080 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.185108 4753 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.185139 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.185208 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:20Z","lastTransitionTime":"2026-01-29T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.195130 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.209496 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.232736 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 
2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.247028 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.259298 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc 
kubenswrapper[4753]: I0129 14:03:20.274840 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.282022 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.282123 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74bbq\" (UniqueName: \"kubernetes.io/projected/281741ae-7781-4682-8b1d-207c9a437581-kube-api-access-74bbq\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.282260 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.282344 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs 
podName:281741ae-7781-4682-8b1d-207c9a437581 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:20.782318569 +0000 UTC m=+35.477052961 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs") pod "network-metrics-daemon-57lv7" (UID: "281741ae-7781-4682-8b1d-207c9a437581") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.288418 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.288619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.288708 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.288786 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.288854 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:20Z","lastTransitionTime":"2026-01-29T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.291896 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.308706 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74bbq\" (UniqueName: \"kubernetes.io/projected/281741ae-7781-4682-8b1d-207c9a437581-kube-api-access-74bbq\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.311479 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2
af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.325317 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.336276 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.349087 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.360498 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.379483 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cer
t\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:16Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.342963 6057 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.342993 6057 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.343020 6057 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0129 14:03:16.343048 6057 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.343072 6057 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.344721 6057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 14:03:16.344741 6057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 14:03:16.344776 6057 factory.go:656] Stopping watch factory\\\\nI0129 14:03:16.344802 6057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 14:03:16.344815 6057 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"dbus.Signature{str:\\\\\\\"(uo)\\\\\\\"}, value:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\"}}, \\\\\\\"StateChangeTimestamp\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x64987519d763c}, \\\\\\\"StateChangeTimestampMonotonic\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x4d04347}, \\\\\\\"SubState\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"running\\\\\\\"}}, []string{\\\\\\\"Conditions\\\\\\\", \\\\\\\"Asserts\\\\\\\"}}, Sequence:0x20}\\\\nI0129 14:03:18.952571 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1\\\\\\\", Name:\\\\\\\"org.freedesktop.systemd1.Manager.JobRemoved\\\\\\\", Body:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\", \\\\\\\"crio-conmon-ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e.scope\\\\\\\", \\\\\\\"done\\\\\\\"}, Sequence:0x21}\\\\nI0129 14:03:18.952582 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2dconmon_2dded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x22}\\\\nI0129 14:03:18.952593 6189 
ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:20Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.391813 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.391860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.391871 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.391887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.391898 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:20Z","lastTransitionTime":"2026-01-29T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.437968 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/1.log" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.494750 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.494825 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.494845 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.494872 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.494891 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:20Z","lastTransitionTime":"2026-01-29T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.597817 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.597894 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.597915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.597943 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.597961 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:20Z","lastTransitionTime":"2026-01-29T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.700972 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.701072 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.701095 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.701124 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.701145 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:20Z","lastTransitionTime":"2026-01-29T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.788362 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.788440 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.788476 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.788550 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.788616 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.788661 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs podName:281741ae-7781-4682-8b1d-207c9a437581 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:21.788635512 +0000 UTC m=+36.483369934 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs") pod "network-metrics-daemon-57lv7" (UID: "281741ae-7781-4682-8b1d-207c9a437581") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.788671 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.788691 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:36.788677684 +0000 UTC m=+51.483412096 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.788881 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:36.788849448 +0000 UTC m=+51.483583870 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.804988 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.805050 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.805068 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.805091 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.805111 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:20Z","lastTransitionTime":"2026-01-29T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.889038 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.889135 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.889220 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.889340 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.889346 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:03:36.889294872 +0000 UTC m=+51.584029294 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.889359 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.889422 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.889506 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.889532 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.889441 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.889638 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:36.889599971 +0000 UTC m=+51.584334523 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:20 crc kubenswrapper[4753]: E0129 14:03:20.889718 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:36.889693953 +0000 UTC m=+51.584428535 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.908359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.908432 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.908451 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.908477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:20 crc kubenswrapper[4753]: I0129 14:03:20.908495 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:20Z","lastTransitionTime":"2026-01-29T14:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.011763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.011832 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.011854 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.011886 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.011906 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.106996 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 08:54:28.166089004 +0000 UTC Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.114964 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.115030 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.115073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.115099 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.115117 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.148393 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.148494 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.148392 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:21 crc kubenswrapper[4753]: E0129 14:03:21.148578 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:21 crc kubenswrapper[4753]: E0129 14:03:21.148719 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:21 crc kubenswrapper[4753]: E0129 14:03:21.148873 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.218516 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.218574 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.218594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.218621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.218641 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.322595 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.322666 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.322684 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.322886 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.322903 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.426491 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.426575 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.426593 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.426618 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.426638 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.529908 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.529939 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.529947 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.529960 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.529970 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.633136 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.633234 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.633264 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.633285 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.633297 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.737006 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.737096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.737116 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.737198 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.737291 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.800956 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:21 crc kubenswrapper[4753]: E0129 14:03:21.801338 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 14:03:21 crc kubenswrapper[4753]: E0129 14:03:21.801504 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs podName:281741ae-7781-4682-8b1d-207c9a437581 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:23.801469455 +0000 UTC m=+38.496204027 (durationBeforeRetry 2s). 
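[editor's note: the nestedpendingoperations entry above shows the kubelet's per-volume retry gate: after a failed MountVolume.SetUp it stamps an absolute "No retries permitted until" deadline and grows the wait on repeated failures (here durationBeforeRetry 2s). A minimal Python sketch of that doubling-deadline pattern follows; the initial and maximum durations are illustrative assumptions, not kubelet's exact constants.]

import time

# Sketch of the doubling retry gate reflected by the
# "(durationBeforeRetry 2s)" / "No retries permitted until ..." entry above.
# INITIAL_BACKOFF_S and MAX_BACKOFF_S are assumed values for illustration.
INITIAL_BACKOFF_S = 0.5
MAX_BACKOFF_S = 122.0

def mount_with_backoff(setup):
    """Retry a volume SetUp callable, doubling the wait after each failure."""
    backoff = INITIAL_BACKOFF_S
    while True:
        try:
            return setup()
        except Exception as err:
            # The kubelet logs this as an absolute deadline rather than
            # a relative sleep, which is what the entry above shows.
            deadline = time.monotonic() + backoff
            print(f"SetUp failed: {err}; no retries permitted for {backoff:.1f}s")
            time.sleep(max(0.0, deadline - time.monotonic()))
            backoff = min(backoff * 2.0, MAX_BACKOFF_S)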
[the same event/condition block was re-logged unchanged at 14:03:21.841022]
Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.894789 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.894851 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.894864 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.894887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.894902 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: E0129 14:03:21.914779 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:21Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.924865 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.924917 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.924931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.924962 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.924983 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: E0129 14:03:21.946833 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:21Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.951901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.951953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
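[editor's note: every one of these status patches fails at the same place: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z while the node clock reads 2026-01-29. One way to confirm the validity window from the node is to pull the serving certificate directly; a minimal sketch, assuming Python with the third-party cryptography package installed.]

import ssl
from datetime import datetime

from cryptography import x509  # third-party package; assumed available

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint named in the errors above

# ssl.get_server_certificate does not verify the chain, so it can fetch
# a serving certificate that verified clients (like the kubelet) reject.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

# not_valid_before / not_valid_after are naive datetimes in UTC.
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)
print("expired:  ", datetime.utcnow() > cert.not_valid_after)

[if the printed notAfter predates the node clock, rotating the webhook's serving certificate (or correcting the clock skew) is what the kubelet is waiting on]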
event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.951965 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.951989 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.952002 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: E0129 14:03:21.970928 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:21Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.976788 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.976878 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
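[editor's note: since the multi-kilobyte patch payload is re-logged on every retry, collapsing the artifact to its distinct x509 failure reasons makes the triage signal obvious. A small sketch, assuming a local copy of this log at the hypothetical path kubelet.log; it scans every physical line so entries wrapped across lines are still counted.]

import re
from collections import Counter

LOG_PATH = "kubelet.log"  # hypothetical local copy of this artifact

# Pull out the x509 failure reason wherever it appears, so the repeated
# "Error updating node status" payloads collapse to one counted line.
pattern = re.compile(r'x509: certificate has expired[^"\\]*')

reasons = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        for reason in pattern.findall(line):
            reasons[reason] += 1

for reason, count in reasons.most_common():
    print(f"{count:4d}x {reason}")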
event="NodeHasNoDiskPressure" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.976898 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.976928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:21 crc kubenswrapper[4753]: I0129 14:03:21.976948 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:21Z","lastTransitionTime":"2026-01-29T14:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:21 crc kubenswrapper[4753]: E0129 14:03:21.999583 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:21Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.006334 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.006395 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.006412 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.006486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.006503 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:22 crc kubenswrapper[4753]: E0129 14:03:22.026786 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[ ...remaining image entries elided; byte-for-byte identical to the tail of the first patch attempt logged above... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:22Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:22 crc kubenswrapper[4753]: E0129 14:03:22.026980 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.030001 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
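All of the failed status patches above share one root cause: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate whose NotAfter (2025-08-24T17:21:41Z) lies months before the node clock (2026-01-29), so TLS verification fails on every attempt until the kubelet exhausts its retry budget ("update node status exceeds retry count"). A minimal Go sketch to confirm the expiry from the affected host; the address is taken from the log line, and InsecureSkipVerify is set only so the handshake completes and the expired leaf can be inspected rather than rejected:

// certcheck.go: report validity of the certificate served on the webhook port.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Address taken from the failing webhook URL in the log above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	now := time.Now()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		// now.After(NotAfter) is the condition behind "certificate has expired".
		fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%v\n",
			cert.Subject.CommonName,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			now.After(cert.NotAfter))
	}
}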
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.030060 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.030077 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.030107 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.030131 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.107354 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 22:28:23.44511716 +0000 UTC
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.132783 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.132887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.132905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.132936 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.132954 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.148439 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:22 crc kubenswrapper[4753]: E0129 14:03:22.148622 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581"
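Alongside the webhook failure there is a second, independent blocker: the node stays NotReady and pod sandboxes cannot start because the container runtime reports NetworkReady=false, finding no CNI configuration in /etc/kubernetes/cni/net.d/. The runtime-side probe amounts to scanning that directory for network config files; a rough Go equivalent is below (the directory comes from the log, while the .conf/.conflist/.json extension set follows the usual libcni convention and is an assumption here):

// cnicheck.go: rough sketch of the "is any CNI network configured?" probe.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path from the kubelet log above
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("read dir:", err)
		return
	}
	found := []string{}
	for _, e := range entries {
		// Extension set assumed from common libcni convention.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, filepath.Join(dir, e.Name()))
		}
	}
	if len(found) == 0 {
		// This empty-directory state is what drives NetworkPluginNotReady above.
		fmt.Println("no CNI configuration file in", dir)
		return
	}
	fmt.Println("CNI configs:", found)
}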
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.236841 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.236908 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.236919 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.236942 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.236961 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.340374 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.340480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.340514 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.340548 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.340575 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.444731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.444815 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.444833 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.444861 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.444879 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.548318 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.548404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.548422 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.548447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.548464 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.651379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.651448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.651465 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.651487 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.651508 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.754340 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.754401 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.754417 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.754443 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.754458 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.857741 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.857809 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.857830 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.857866 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.857888 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.961822 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.961907 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.961932 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.961966 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:22 crc kubenswrapper[4753]: I0129 14:03:22.961994 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:22Z","lastTransitionTime":"2026-01-29T14:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.065991 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.066062 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.066080 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.066112 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.066132 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:23Z","lastTransitionTime":"2026-01-29T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.107763 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 06:29:04.24426094 +0000 UTC
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.148592 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.148712 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.148597 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 14:03:23 crc kubenswrapper[4753]: E0129 14:03:23.148858 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 14:03:23 crc kubenswrapper[4753]: E0129 14:03:23.148973 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 14:03:23 crc kubenswrapper[4753]: E0129 14:03:23.149250 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
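The kubelet-serving certificate_manager lines are worth a note: the expiration stays fixed at 2026-02-24 05:53:03, yet the rotation deadline is re-drawn on every pass (2025-12-02 above, 2025-12-15 here, and 2025-12-30 / 2026-01-03 further down), because the deadline is sampled at random from a window inside the certificate's validity period. A sketch of that behavior, assuming the conventional 70% to 90% window; the exact fraction is an assumption, not something this log states:

// jitter.go: sketch of a jittered rotation deadline, drawn uniformly from the
// assumed 70%-90% slice of the certificate's validity window.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	// notBefore is not shown in the log; a one-year certificate ending at the
	// logged expiration is assumed purely for illustration.
	notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC)
	notBefore := notAfter.AddDate(-1, 0, 0)
	for i := 0; i < 3; i++ {
		// Each draw lands somewhere in the window the logged deadlines span.
		fmt.Println("rotation deadline:", rotationDeadline(notBefore, notAfter))
	}
}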
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.168992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.169278 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.169377 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.169467 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.169552 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:23Z","lastTransitionTime":"2026-01-29T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.272842 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.272908 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.272925 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.272950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.272969 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:23Z","lastTransitionTime":"2026-01-29T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.376826 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.376910 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.376968 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.377006 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.377033 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:23Z","lastTransitionTime":"2026-01-29T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.480652 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.480720 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.480743 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.480770 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.480790 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:23Z","lastTransitionTime":"2026-01-29T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.586265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.586390 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.586512 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.586620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.586714 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:23Z","lastTransitionTime":"2026-01-29T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.691329 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.691396 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.691413 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.691438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.691454 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:23Z","lastTransitionTime":"2026-01-29T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.795307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.795388 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.795423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.795454 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.795476 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:23Z","lastTransitionTime":"2026-01-29T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.827027 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:23 crc kubenswrapper[4753]: E0129 14:03:23.827307 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 29 14:03:23 crc kubenswrapper[4753]: E0129 14:03:23.827460 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs podName:281741ae-7781-4682-8b1d-207c9a437581 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:27.827422238 +0000 UTC m=+42.522156800 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs") pod "network-metrics-daemon-57lv7" (UID: "281741ae-7781-4682-8b1d-207c9a437581") : object "openshift-multus"/"metrics-daemon-secret" not registered
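The nestedpendingoperations entry records the volume manager's exponential backoff: this failure earns a 4s quarantine before the next MountVolume attempt (the retry is stamped for 14:03:27, and m=+42.52 dates it about 42s after kubelet start), and each further failure of the same operation doubles the wait up to a cap. A sketch of the doubling pattern follows; only the 4s figure comes from the log, while the starting value and the cap are illustrative assumptions:

// backoff.go: sketch of per-operation exponential backoff as suggested by
// "durationBeforeRetry 4s": double the wait after every failure, up to a cap.
package main

import (
	"fmt"
	"time"
)

type backoff struct {
	last time.Duration
	max  time.Duration
}

func (b *backoff) next() time.Duration {
	if b.last == 0 {
		b.last = 500 * time.Millisecond // assumed starting point
	} else if b.last < b.max {
		b.last *= 2
		if b.last > b.max {
			b.last = b.max
		}
	}
	return b.last
}

func main() {
	b := &backoff{max: 2 * time.Minute} // cap is an assumption
	for i := 0; i < 8; i++ {
		fmt.Printf("failure %d: retry in %v\n", i+1, b.next())
	}
	// Prints 500ms, 1s, 2s, 4s (the value seen in the log), 8s, ...
}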
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.898894 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.898951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.898968 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.898991 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:23 crc kubenswrapper[4753]: I0129 14:03:23.899008 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:23Z","lastTransitionTime":"2026-01-29T14:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.003362 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.003422 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.003432 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.003452 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.003465 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.107198 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.107280 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.107294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.107320 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.107342 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.108003 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 05:59:52.196385325 +0000 UTC Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.148875 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:24 crc kubenswrapper[4753]: E0129 14:03:24.149021 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.210443 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.210485 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.210494 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.210510 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.210521 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.313713 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.313762 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.313774 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.313792 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.313803 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.417038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.417102 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.417115 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.417135 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.417174 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.520571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.520637 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.520658 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.520683 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.520702 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.624477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.624557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.624580 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.624614 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.624636 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.728565 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.728641 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.728661 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.728688 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.728713 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.832106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.832217 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.832243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.832279 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.832297 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.935136 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.935283 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.935307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.935331 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:24 crc kubenswrapper[4753]: I0129 14:03:24.935348 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:24Z","lastTransitionTime":"2026-01-29T14:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.038836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.038901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.038921 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.038951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.038968 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.108192 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 12:59:39.561081112 +0000 UTC
Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.141885 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.141951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.141969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.142002 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.142019 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.148555 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.148625 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.148588 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 14:03:25 crc kubenswrapper[4753]: E0129 14:03:25.148736 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 14:03:25 crc kubenswrapper[4753]: E0129 14:03:25.148993 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 14:03:25 crc kubenswrapper[4753]: E0129 14:03:25.149213 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
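For bulk analysis of an excerpt like this, each line splits into a journald prefix (timestamp, host, unit and PID), a klog header (severity letter, MMDD date, wall-clock time, PID, source file and line), and the structured message. A small Go sketch of that split; the regular expression is fitted by eye to the lines above rather than taken from any published schema:

// klogsplit.go: split one journald+klog line from this excerpt into its parts.
package main

import (
	"fmt"
	"regexp"
)

// Groups: 1 journald time, 2 host, 3 unit PID, 4 severity, 5 MMDD,
// 6 wall time, 7 klog PID, 8 source file:line, 9 message.
var re = regexp.MustCompile(
	`^(\w{3} +\d+ [\d:]+) (\S+) kubenswrapper\[(\d+)\]: ` +
		`([IWEF])(\d{4}) ([\d:.]+) +(\d+) ([\w.]+:\d+)\] (.*)$`)

func main() {
	line := `Jan 29 14:03:25 crc kubenswrapper[4753]: E0129 14:03:25.149213 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready"`
	m := re.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Printf("time=%s host=%s severity=%s source=%s msg=%s\n",
		m[1], m[2], m[4], m[8], m[9])
}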
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.246071 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.246144 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.246204 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.246237 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.246257 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.349233 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.349311 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.349338 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.349369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.349398 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.453641 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.453722 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.453741 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.453768 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.453788 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.557599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.557646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.557658 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.557675 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.557686 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.661353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.661409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.661424 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.661444 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.661460 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.764393 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.764450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.764464 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.764487 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.764504 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.868209 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.868297 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.868317 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.868348 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.868368 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.971447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.971497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.971507 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.971526 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:25 crc kubenswrapper[4753]: I0129 14:03:25.971538 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:25Z","lastTransitionTime":"2026-01-29T14:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.074634 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.074685 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.074697 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.074717 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.074730 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:26Z","lastTransitionTime":"2026-01-29T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.109225 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 22:43:10.323860157 +0000 UTC Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.148666 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:26 crc kubenswrapper[4753]: E0129 14:03:26.148844 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.168873 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.177788 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.177870 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.177892 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.178349 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.178575 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:26Z","lastTransitionTime":"2026-01-29T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.187056 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.201033 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.220749 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.236438 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.260763 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":
\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.275218 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.282194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.282246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.282259 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.282279 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.282293 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:26Z","lastTransitionTime":"2026-01-29T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.288645 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.310248 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.345222 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.370113 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.385038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.385080 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.385092 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.385111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.385123 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:26Z","lastTransitionTime":"2026-01-29T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.394779 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.408299 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.419312 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.429757 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.445757 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.468251 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf733
1d1ab414695aefcf43e30139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b9d9ec2dd5bf54cafce5d4b422f9d2c647ba71e1819ec0455eb080acde6f55e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:16Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.342963 6057 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.342993 6057 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.343020 6057 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0129 14:03:16.343048 6057 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0129 14:03:16.343072 6057 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0129 14:03:16.344721 6057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0129 14:03:16.344741 6057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0129 14:03:16.344776 6057 factory.go:656] Stopping watch factory\\\\nI0129 14:03:16.344802 6057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0129 14:03:16.344815 6057 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"dbus.Signature{str:\\\\\\\"(uo)\\\\\\\"}, value:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\"}}, \\\\\\\"StateChangeTimestamp\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x64987519d763c}, \\\\\\\"StateChangeTimestampMonotonic\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x4d04347}, \\\\\\\"SubState\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"running\\\\\\\"}}, []string{\\\\\\\"Conditions\\\\\\\", \\\\\\\"Asserts\\\\\\\"}}, Sequence:0x20}\\\\nI0129 14:03:18.952571 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1\\\\\\\", Name:\\\\\\\"org.freedesktop.systemd1.Manager.JobRemoved\\\\\\\", Body:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\", \\\\\\\"crio-conmon-ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e.scope\\\\\\\", \\\\\\\"done\\\\\\\"}, Sequence:0x21}\\\\nI0129 14:03:18.952582 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", 
Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2dconmon_2dded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x22}\\\\nI0129 14:03:18.952593 6189 ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\
\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:26Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.488771 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.488834 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.488845 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.488865 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.488877 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:26Z","lastTransitionTime":"2026-01-29T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.592432 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.592507 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.592526 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.592555 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.592576 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:26Z","lastTransitionTime":"2026-01-29T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.696255 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.696307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.696320 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.696339 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.696351 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:26Z","lastTransitionTime":"2026-01-29T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.799866 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.799914 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.799926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.799945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.799962 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:26Z","lastTransitionTime":"2026-01-29T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.903558 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.903606 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.903616 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.903637 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:26 crc kubenswrapper[4753]: I0129 14:03:26.903649 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:26Z","lastTransitionTime":"2026-01-29T14:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.007036 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.007107 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.007124 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.007148 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.007202 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.109426 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 07:59:28.682454568 +0000 UTC Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.111029 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.111142 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.111177 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.111199 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.111216 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.148899 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.148925 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:27 crc kubenswrapper[4753]: E0129 14:03:27.149559 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.149733 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:27 crc kubenswrapper[4753]: E0129 14:03:27.149807 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:27 crc kubenswrapper[4753]: E0129 14:03:27.149929 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.215084 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.215141 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.215184 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.215207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.215224 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317889 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317935 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317952 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317962 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317889 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317935 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317952 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.317962 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.421623 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.421691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.421707 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.421737 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.421758 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.525365 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.525437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.525446 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.525468 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.525480 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.629603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.629694 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.629715 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.629746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.629768 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.733364 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.733432 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.733445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.733463 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.733475 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.836854 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.836937 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.836950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.836971 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.836983 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.875626 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:27 crc kubenswrapper[4753]: E0129 14:03:27.875840 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 29 14:03:27 crc kubenswrapper[4753]: E0129 14:03:27.875928 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs podName:281741ae-7781-4682-8b1d-207c9a437581 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:35.87590202 +0000 UTC m=+50.570636402 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs") pod "network-metrics-daemon-57lv7" (UID: "281741ae-7781-4682-8b1d-207c9a437581") : object "openshift-multus"/"metrics-daemon-secret" not registered
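
The nestedpendingoperations.go:348 record above is the volume manager refusing to retry the failed metrics-certs mount for another 8s. That durationBeforeRetry is consistent with a doubling backoff curve; the sketch below assumes a 500ms initial delay, a factor of 2 and a cap of roughly two minutes, all three of which are illustrative assumptions rather than values read out of this kubelet:

package main

import (
    "fmt"
    "time"
)

func main() {
    delay := 500 * time.Millisecond  // assumed initial delay
    const maxDelay = 2 * time.Minute // assumed cap
    for i := 0; i < 9; i++ {
        fmt.Println(delay) // 500ms 1s 2s 4s 8s ... the log's 8s sits on this curve
        delay *= 2
        if delay > maxDelay {
            delay = maxDelay
        }
    }
}

The practical consequence is visible in the log itself: the retry is scheduled for 14:03:35, so nothing about this volume appears again until then.
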
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.940427 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.940489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.940509 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.940531 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:27 crc kubenswrapper[4753]: I0129 14:03:27.940542 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:27Z","lastTransitionTime":"2026-01-29T14:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.044340 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.044439 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.044451 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.044475 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.044488 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
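
Note the certificate_manager.go:356 lines scattered through this log: the expiration is fixed at 2026-02-24 05:53:03, yet the rotation deadline differs on every occurrence (2025-12-04 earlier, 2026-01-02 just below, 2025-11-15 later on). That is expected behavior: client-go re-draws a jittered deadline each time, landing somewhere in roughly the 70-90% band of the certificate's lifetime. The sketch below reproduces that draw; the exact band and the assumed one-year lifetime are our reading of client-go, not verified against this build:

package main

import (
    "fmt"
    "math/rand"
    "time"
)

// rotationDeadline picks a uniformly jittered point in the assumed
// 70-90% band of the certificate lifetime.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
    total := float64(notAfter.Sub(notBefore))
    return notBefore.Add(time.Duration(total * (0.7 + 0.2*rand.Float64())))
}

func main() {
    notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC) // expiration from the log
    notBefore := notAfter.AddDate(-1, 0, 0)                   // assumed 1-year lifetime
    for i := 0; i < 3; i++ {
        fmt.Println(rotationDeadline(notBefore, notAfter)) // a different deadline per draw, as in the log
    }
}
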
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.109854 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 23:00:45.16940235 +0000 UTC
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.147991 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.148065 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.148078 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.148103 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.148119 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.148446 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:28 crc kubenswrapper[4753]: E0129 14:03:28.148602 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.251511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.251580 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.251590 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.251611 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.251624 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.354902 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.355017 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.355038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.355067 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.355085 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.459085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.459144 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.459198 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.459224 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.459240 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.563452 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.563531 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.563549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.563575 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.563594 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.666279 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.666365 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.666382 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.666409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.666426 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.769822 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.769874 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.769884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.769901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.769912 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.877978 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.878011 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.878026 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.878041 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.878050 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
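
The next records show the ovnkube-controller container being held in CrashLoopBackOff with "back-off 10s restarting failed container=ovnkube-controller". The kubelet doubles that delay on each subsequent crash up to a cap; the 10s starting point matches the record below, while the 5m cap is the commonly documented default and is taken here as an assumption. A sketch of the resulting delay sequence:

package main

import (
    "fmt"
    "time"
)

func main() {
    backoff := 10 * time.Second        // "back-off 10s" from the record below
    const maxBackoff = 5 * time.Minute // assumed cap
    for i := 0; i < 7; i++ {
        fmt.Println(backoff) // 10s 20s 40s 1m20s 2m40s 5m0s 5m0s
        backoff *= 2
        if backoff > maxBackoff {
            backoff = maxBackoff
        }
    }
}

Because the back-off is still at its first step here, the container is retried quickly; the OVN pod becoming ready is what eventually writes the missing CNI config and clears the NetworkReady=false condition.
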
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.945190 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.946797 4753 scope.go:117] "RemoveContainer" containerID="f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139"
Jan 29 14:03:28 crc kubenswrapper[4753]: E0129 14:03:28.947120 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.981846 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.982366 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.982498 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.982635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.982818 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:28Z","lastTransitionTime":"2026-01-29T14:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:28 crc kubenswrapper[4753]: I0129 14:03:28.983093 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:28Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.002753 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.016073 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.039503 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 
2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.057079 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.074487 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc 
kubenswrapper[4753]: I0129 14:03:29.086891 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.087020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.087042 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.087082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.087119 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:29Z","lastTransitionTime":"2026-01-29T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.095506 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 
14:03:29.110998 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 22:30:31.616970456 +0000 UTC Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.117403 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.139012 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.148940 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.149004 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.148944 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 14:03:29 crc kubenswrapper[4753]: E0129 14:03:29.149209 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 14:03:29 crc kubenswrapper[4753]: E0129 14:03:29.149400 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 14:03:29 crc kubenswrapper[4753]: E0129 14:03:29.149508 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.154720 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.170003 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.190209 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.190295 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.190318 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.190350 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.190389 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:29Z","lastTransitionTime":"2026-01-29T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.192373 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.212604 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\
\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.249327 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf733
1d1ab414695aefcf43e30139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"dbus.Signature{str:\\\\\\\"(uo)\\\\\\\"}, value:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\"}}, \\\\\\\"StateChangeTimestamp\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x64987519d763c}, \\\\\\\"StateChangeTimestampMonotonic\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x4d04347}, \\\\\\\"SubState\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"running\\\\\\\"}}, []string{\\\\\\\"Conditions\\\\\\\", \\\\\\\"Asserts\\\\\\\"}}, Sequence:0x20}\\\\nI0129 14:03:18.952571 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1\\\\\\\", Name:\\\\\\\"org.freedesktop.systemd1.Manager.JobRemoved\\\\\\\", Body:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\", \\\\\\\"crio-conmon-ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e.scope\\\\\\\", \\\\\\\"done\\\\\\\"}, Sequence:0x21}\\\\nI0129 14:03:18.952582 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2dconmon_2dded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x22}\\\\nI0129 14:03:18.952593 6189 ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.272448 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.290768 4753 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902c
ee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.293437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.293492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.293513 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.293539 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.293557 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:29Z","lastTransitionTime":"2026-01-29T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.305949 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:29Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.397732 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.397803 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.397823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.397850 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.397867 4753 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:29Z","lastTransitionTime":"2026-01-29T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.501205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.501258 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.501269 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.501291 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.501317 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:29Z","lastTransitionTime":"2026-01-29T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.604197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.604247 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.604258 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.604280 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.604294 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:29Z","lastTransitionTime":"2026-01-29T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.707377 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.707447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.707497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.707523 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.707677 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:29Z","lastTransitionTime":"2026-01-29T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.810355 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.810414 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.810427 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.810447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.810462 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:29Z","lastTransitionTime":"2026-01-29T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.914268 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.914363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.914388 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.914473 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:29 crc kubenswrapper[4753]: I0129 14:03:29.914522 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:29Z","lastTransitionTime":"2026-01-29T14:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.017691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.017762 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.017785 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.017816 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.017836 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.111244 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 03:29:24.366054738 +0000 UTC Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.120920 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.120984 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.121002 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.121026 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.121044 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.149422 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:30 crc kubenswrapper[4753]: E0129 14:03:30.149651 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.225521 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.225594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.225613 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.225654 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.225674 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.328922 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.329013 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.329049 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.329080 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.329101 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.432418 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.432492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.432510 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.432537 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.432556 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.535531 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.535638 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.535663 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.535699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.535723 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.639527 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.639593 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.639615 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.639646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.639671 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.743304 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.743378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.743400 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.743430 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.743459 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.846889 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.846972 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.846997 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.847029 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.847051 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.950650 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.950718 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.950740 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.950767 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:30 crc kubenswrapper[4753]: I0129 14:03:30.950787 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:30Z","lastTransitionTime":"2026-01-29T14:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.053967 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.054052 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.054072 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.054099 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.054119 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.112449 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 12:57:56.326803384 +0000 UTC Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.149411 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.149470 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.149511 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:31 crc kubenswrapper[4753]: E0129 14:03:31.150222 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:31 crc kubenswrapper[4753]: E0129 14:03:31.150294 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:31 crc kubenswrapper[4753]: E0129 14:03:31.150479 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.157891 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.157945 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.157980 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.158008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.158029 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.261353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.261441 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.261465 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.261497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.261521 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.364956 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.365017 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.365035 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.365059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.365076 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.468502 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.468571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.468590 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.468615 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.468633 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.572266 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.572335 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.572349 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.572367 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.572379 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.675603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.675673 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.675718 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.675747 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.675770 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.779759 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.779833 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.779852 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.779878 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.779896 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.883384 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.883442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.883459 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.883482 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.883499 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.991241 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.991328 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.991347 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.991375 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:31 crc kubenswrapper[4753]: I0129 14:03:31.991394 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:31Z","lastTransitionTime":"2026-01-29T14:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.095475 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.095557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.095576 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.095601 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.095620 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.113423 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 18:05:52.88380692 +0000 UTC Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.148336 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:32 crc kubenswrapper[4753]: E0129 14:03:32.148521 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.198305 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.198547 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.198566 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.198598 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.198654 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.302442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.302524 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.302551 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.302579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.302598 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.357069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.357139 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.357197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.357232 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.357256 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: E0129 14:03:32.378855 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:32Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.384579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.384677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.384697 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.384764 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.384785 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: E0129 14:03:32.416066 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:32Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.422380 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.422435 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
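The status patch above is rejected before it ever reaches the Node object: the apiserver must first call the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, and the TLS handshake fails because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2026-01-29T14:03:32Z. A minimal Go sketch of the validity-window check behind this x509 error; the webhook-serving-cert.pem path is hypothetical, not taken from this log:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// "webhook-serving-cert.pem" is a hypothetical input file, not a path from this log.
	data, err := os.ReadFile("webhook-serving-cert.pem")
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	now := time.Now().UTC()
	switch {
	case now.Before(cert.NotBefore):
		fmt.Printf("x509: certificate is not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	case now.After(cert.NotAfter):
		// The branch this log is hitting:
		// "current time 2026-01-29T14:03:32Z is after 2025-08-24T17:21:41Z"
		fmt.Printf("x509: certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}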
event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.422450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.422476 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.422496 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: E0129 14:03:32.443591 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:32Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.448529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.448581 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.448595 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.448616 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.448631 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: E0129 14:03:32.471373 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:32Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.476773 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.476834 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.476853 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.476883 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.476903 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: E0129 14:03:32.498951 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:32Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:32 crc kubenswrapper[4753]: E0129 14:03:32.499220 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.501748 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
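The patch payloads at 14:03:32.443591, .471373, and .498951 are byte-identical to the one shown in full above, so they are elided as \"{...}\"; only the attempt timestamps differ. The sequence ends with "update node status exceeds retry count" because the node-status sync is a bounded retry loop, not an indefinite one. A sketch of that shape, assuming a bound of 5; nodeStatusUpdateRetry and the closure are illustrative stand-ins, not the exact kubelet internals:

package main

import (
	"errors"
	"fmt"
)

// Assumed bound, consistent with the handful of attempts logged above.
const nodeStatusUpdateRetry = 5

// updateNodeStatus tries the status patch a fixed number of times,
// then gives up with the error seen in the log.
func updateNodeStatus(try func(attempt int) error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := try(i); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	err := updateNodeStatus(func(int) error {
		// Every PATCH is rejected while the webhook certificate is expired.
		return errors.New("failed calling webhook: certificate has expired")
	})
	fmt.Println(err)
}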
event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.501804 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.501822 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.501847 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.501867 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.605973 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.606062 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.606090 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.606123 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.606146 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.709944 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.710020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.710029 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.710049 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.710060 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.813744 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.813820 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.813838 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.813866 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.813883 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.917505 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.917577 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.917597 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.917625 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:32 crc kubenswrapper[4753]: I0129 14:03:32.917645 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:32Z","lastTransitionTime":"2026-01-29T14:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.020451 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.020517 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.020541 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.020573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.020597 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
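Every Ready=False heartbeat above carries the same root cause: the container runtime reports NetworkReady=false because no CNI configuration exists yet under /etc/kubernetes/cni/net.d/. Illustratively, that readiness amounts to a scan of the conf directory, sketched below; the accepted extensions are an assumption about typical loader behavior, not the actual libcni code:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // the directory named in the log
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		return
	}
	var confs []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions CNI loaders typically accept (assumed)
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		fmt.Println("no CNI configuration file found; network plugin not ready")
		return
	}
	fmt.Println("found CNI configs:", confs)
}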
Has your network provider started?"}
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.114233 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 14:06:24.901468361 +0000 UTC
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.123794 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.123872 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.123897 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.123929 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.123954 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.148896 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.148950 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.148902 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 14:03:33 crc kubenswrapper[4753]: E0129 14:03:33.149068 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 14:03:33 crc kubenswrapper[4753]: E0129 14:03:33.149284 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 14:03:33 crc kubenswrapper[4753]: E0129 14:03:33.149439 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
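The certificate_manager.go:356 entry above (and its sibling at 14:03:34.114581 further down) reports the same kubelet-serving expiry, 2026-02-24 05:53:03, but a different rotation deadline each evaluation (2025-12-06 vs 2025-12-26), and both deadlines already lie in the past, so rotation is due immediately. Deadlines like these are typically drawn as a jittered fraction of the certificate's validity window; a sketch under that assumption (the 70-90% range and the issuance time are assumptions, not values from this log):

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a point somewhere in the back part of the
// validity window, recomputed on each evaluation -- consistent with the
// two certificate_manager lines reporting different deadlines.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	// Uniform point in [70%, 90%] of the lifetime (assumed jitter range).
	frac := 0.7 + 0.2*rand.Float64()
	return notBefore.Add(time.Duration(float64(total) * frac))
}

func main() {
	notAfter, _ := time.Parse(time.RFC3339, "2026-02-24T05:53:03Z") // expiry from the log
	notBefore := notAfter.Add(-365 * 24 * time.Hour)                // issuance time assumed
	fmt.Println("rotation deadline:", rotationDeadline(notBefore, notAfter).UTC())
}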
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.227367 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.227431 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.227454 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.227477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.227495 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.331695 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.331772 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.331790 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.331819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.331843 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.435291 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.435358 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.435377 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.435402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.435422 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.539544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.539655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.539681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.539713 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.539737 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.643017 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.643058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.643068 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.643084 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.643094 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.746922 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.746976 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.746987 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.747008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.747022 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.850249 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.850331 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.850350 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.850378 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.850402 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.953932 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.953999 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.954016 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.954043 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:33 crc kubenswrapper[4753]: I0129 14:03:33.954061 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:33Z","lastTransitionTime":"2026-01-29T14:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.057323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.057391 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.057410 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.057436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.057454 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.114581 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:05:34.42914509 +0000 UTC
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.148489 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:34 crc kubenswrapper[4753]: E0129 14:03:34.148712 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581"
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.163018 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.163113 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.163140 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.163219 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.163252 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.266566 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.266629 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.266649 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.266675 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.266694 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.370315 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.370385 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.370402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.370427 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.370445 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.473122 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.473224 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.473243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.473267 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.473285 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.576363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.576438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.576461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.576492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.576528 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.680566 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.680623 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.680642 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.680663 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.680679 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.783524 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.783589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.783603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.783621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.783634 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.886397 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.886481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.886506 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.886539 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.886568 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.990346 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.990464 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.990483 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.990511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:34 crc kubenswrapper[4753]: I0129 14:03:34.990529 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:34Z","lastTransitionTime":"2026-01-29T14:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.094186 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.094250 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.094267 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.094293 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.094310 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:35Z","lastTransitionTime":"2026-01-29T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.115074 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 04:51:01.208469321 +0000 UTC Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.148398 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.148442 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.148398 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:35 crc kubenswrapper[4753]: E0129 14:03:35.148624 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:35 crc kubenswrapper[4753]: E0129 14:03:35.148727 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:35 crc kubenswrapper[4753]: E0129 14:03:35.149008 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.197437 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.197542 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.197559 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.197584 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.197602 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:35Z","lastTransitionTime":"2026-01-29T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.301071 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.301165 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.301232 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.301270 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.301298 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:35Z","lastTransitionTime":"2026-01-29T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.405268 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.405355 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.405379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.405410 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.405434 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:35Z","lastTransitionTime":"2026-01-29T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.508086 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.508177 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.508198 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.508223 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.508242 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:35Z","lastTransitionTime":"2026-01-29T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.611404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.611464 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.611474 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.611496 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.611508 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:35Z","lastTransitionTime":"2026-01-29T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.714389 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.714444 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.714456 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.714477 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.714490 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:35Z","lastTransitionTime":"2026-01-29T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.817791 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.817861 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.817882 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.817913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.817935 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:35Z","lastTransitionTime":"2026-01-29T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.921197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.921246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.921258 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.921277 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.921289 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:35Z","lastTransitionTime":"2026-01-29T14:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:35 crc kubenswrapper[4753]: I0129 14:03:35.970869 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:35 crc kubenswrapper[4753]: E0129 14:03:35.971114 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 14:03:35 crc kubenswrapper[4753]: E0129 14:03:35.971283 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs podName:281741ae-7781-4682-8b1d-207c9a437581 nodeName:}" failed. No retries permitted until 2026-01-29 14:03:51.971225927 +0000 UTC m=+66.665960309 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs") pod "network-metrics-daemon-57lv7" (UID: "281741ae-7781-4682-8b1d-207c9a437581") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.024798 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.024847 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.024859 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.024880 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.024898 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.115952 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 19:27:33.257445439 +0000 UTC Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.128905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.128952 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.129021 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.129056 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.129079 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.148852 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.149088 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.166125 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.180821 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.195272 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.209856 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.226404 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\
\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.231612 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.231708 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.231724 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.231747 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.231765 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.255183 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"dbus.Signature{str:\\\\\\\"(uo)\\\\\\\"}, value:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\"}}, \\\\\\\"StateChangeTimestamp\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x64987519d763c}, \\\\\\\"StateChangeTimestampMonotonic\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x4d04347}, \\\\\\\"SubState\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"running\\\\\\\"}}, []string{\\\\\\\"Conditions\\\\\\\", \\\\\\\"Asserts\\\\\\\"}}, Sequence:0x20}\\\\nI0129 14:03:18.952571 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1\\\\\\\", Name:\\\\\\\"org.freedesktop.systemd1.Manager.JobRemoved\\\\\\\", Body:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\", \\\\\\\"crio-conmon-ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e.scope\\\\\\\", \\\\\\\"done\\\\\\\"}, Sequence:0x21}\\\\nI0129 14:03:18.952582 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2dconmon_2dded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, 
Sequence:0x22}\\\\nI0129 14:03:18.952593 6189 ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-
xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.272434 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.287412 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.299140 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.313927 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.327220 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.335415 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.335468 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.335482 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.335502 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.335515 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.342429 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.355654 4753 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.371913 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.404992 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.427992 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.438192 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.438235 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.438246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.438268 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.438281 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.443122 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:36Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.541254 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.541316 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.541336 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.541361 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.541380 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.644237 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.644296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.644314 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.644339 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.644357 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.747440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.747523 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.747566 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.747725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.747746 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.851238 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.851313 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.851341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.851375 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.851406 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.880616 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.880667 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.880835 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.880831 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.880918 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:04:08.88089801 +0000 UTC m=+83.575632392 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.880941 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:04:08.880933591 +0000 UTC m=+83.575667973 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.954931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.955004 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.955027 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.955059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.955086 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:36Z","lastTransitionTime":"2026-01-29T14:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.981266 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.981425 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.981458 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:08.981423337 +0000 UTC m=+83.676157769 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:03:36 crc kubenswrapper[4753]: I0129 14:03:36.981509 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.981698 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.981740 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.981757 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.981803 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.981843 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.981870 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.981847 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 14:04:08.981822707 +0000 UTC m=+83.676557089 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:36 crc kubenswrapper[4753]: E0129 14:03:36.981975 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 14:04:08.981952521 +0000 UTC m=+83.676687113 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.059148 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.059225 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.059239 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.059265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.059282 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.116887 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 09:41:21.827107031 +0000 UTC Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.149391 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.149422 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.149562 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:37 crc kubenswrapper[4753]: E0129 14:03:37.149668 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:37 crc kubenswrapper[4753]: E0129 14:03:37.149844 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:37 crc kubenswrapper[4753]: E0129 14:03:37.150063 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.162087 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.162128 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.162140 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.162159 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.162185 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.264694 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.264754 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.264768 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.264788 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.264801 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.367788 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.367836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.367850 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.367870 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.367887 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.471656 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.471730 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.471756 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.471786 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.471810 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.576138 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.576225 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.576241 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.576264 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.576279 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.685639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.685763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.685785 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.685809 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.685829 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.788692 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.788765 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.788784 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.788807 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.788824 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.891692 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.891720 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.891728 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.891742 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.891752 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.994296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.994329 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.994337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.994352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:37 crc kubenswrapper[4753]: I0129 14:03:37.994361 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:37Z","lastTransitionTime":"2026-01-29T14:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.097681 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.097755 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.097783 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.097814 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.097835 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:38Z","lastTransitionTime":"2026-01-29T14:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.117206 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 13:28:12.686515607 +0000 UTC Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.149132 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:38 crc kubenswrapper[4753]: E0129 14:03:38.149678 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.200529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.200856 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.200920 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.200989 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.201053 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:38Z","lastTransitionTime":"2026-01-29T14:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.304138 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.304268 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.304335 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.304368 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.304392 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:38Z","lastTransitionTime":"2026-01-29T14:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.406573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.406630 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.406664 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.406694 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.406714 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:38Z","lastTransitionTime":"2026-01-29T14:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.510037 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.510122 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.510145 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.510235 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.510261 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:38Z","lastTransitionTime":"2026-01-29T14:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.613538 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.613607 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.613618 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.613640 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.613653 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:38Z","lastTransitionTime":"2026-01-29T14:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.716855 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.716913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.716927 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.716950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.716968 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:38Z","lastTransitionTime":"2026-01-29T14:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.820461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.820508 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.820523 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.820543 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.820554 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:38Z","lastTransitionTime":"2026-01-29T14:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.922868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.922939 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.922957 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.922998 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:38 crc kubenswrapper[4753]: I0129 14:03:38.923021 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:38Z","lastTransitionTime":"2026-01-29T14:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.026236 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.026302 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.026323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.026348 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.026368 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.118753 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 05:53:32.293024006 +0000 UTC Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.129111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.129144 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.129176 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.129194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.129208 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.149401 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.149480 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.149431 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:39 crc kubenswrapper[4753]: E0129 14:03:39.149638 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:39 crc kubenswrapper[4753]: E0129 14:03:39.149827 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:39 crc kubenswrapper[4753]: E0129 14:03:39.149935 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.232219 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.232280 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.232301 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.232328 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.232347 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.334489 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.334544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.334557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.334576 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.334587 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.437724 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.437804 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.437823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.437910 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.437933 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.540360 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.540404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.540417 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.540435 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.540449 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.643446 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.643514 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.643527 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.643548 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.643563 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.747407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.747481 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.747503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.747532 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.747551 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.850953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.851010 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.851031 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.851051 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.851065 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.954999 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.955056 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.955069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.955094 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:39 crc kubenswrapper[4753]: I0129 14:03:39.955109 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:39Z","lastTransitionTime":"2026-01-29T14:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.058492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.058609 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.058637 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.058670 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.058694 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.119106 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 05:35:00.938913855 +0000 UTC Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.148722 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:40 crc kubenswrapper[4753]: E0129 14:03:40.148917 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.164709 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.164767 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.164786 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.164811 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.164829 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.267569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.267870 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.268003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.268131 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.268302 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.371727 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.371799 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.371823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.371852 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.371874 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.475124 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.475641 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.475823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.476056 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.476305 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.579972 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.580024 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.580042 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.580067 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.580086 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.684194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.684270 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.684289 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.684322 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.684346 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.787569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.787642 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.787660 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.787686 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.787704 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.890512 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.890548 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.890557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.890569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.890579 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.994118 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.994190 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.994203 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.994226 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:40 crc kubenswrapper[4753]: I0129 14:03:40.994240 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:40Z","lastTransitionTime":"2026-01-29T14:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.098588 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.098643 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.098657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.098679 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.098691 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:41Z","lastTransitionTime":"2026-01-29T14:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.119653 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 03:49:38.059852661 +0000 UTC Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.148735 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:41 crc kubenswrapper[4753]: E0129 14:03:41.148887 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.148753 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:41 crc kubenswrapper[4753]: E0129 14:03:41.149133 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.149309 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:41 crc kubenswrapper[4753]: E0129 14:03:41.149597 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.201058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.201375 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.201484 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.201559 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.201638 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:41Z","lastTransitionTime":"2026-01-29T14:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
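The certificate_manager.go entries above report the same expiry (2026-02-24 05:53:03 UTC) but two different rotation deadlines one second apart, which is consistent with a randomized deadline being re-drawn on each evaluation. A minimal Go sketch of how such a jittered deadline can be derived from a certificate's validity window; the 80% threshold, the jitter range, and the issue time are assumptions for illustration, not the kubelet's actual constants.

    package main

    import (
    	"fmt"
    	"math/rand"
    	"time"
    )

    // rotationDeadline picks a random point in the last fifth of the
    // certificate's lifetime, so a fleet of kubelets does not rotate all
    // at once. Re-drawing the random point is also why the two log lines
    // above print different deadlines for the same expiry.
    func rotationDeadline(notBefore, notAfter time.Time) time.Time {
    	total := notAfter.Sub(notBefore)
    	jitter := time.Duration(rand.Int63n(int64(total / 5)))
    	return notBefore.Add(total*4/5 + jitter)
    }

    func main() {
    	// Expiry taken from the log; the 90-day validity window is assumed.
    	notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC)
    	notBefore := notAfter.Add(-90 * 24 * time.Hour)
    	fmt.Println("rotation deadline:", rotationDeadline(notBefore, notAfter))
    }

When the deadline passes before the kubelet obtains a fresh certificate, TLS failures like the expired-webhook-certificate errors recorded further down are the visible symptom.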
Has your network provider started?"} Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.304569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.304638 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.304658 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.304685 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.304704 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:41Z","lastTransitionTime":"2026-01-29T14:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.407501 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.407599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.407629 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.407660 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.407684 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:41Z","lastTransitionTime":"2026-01-29T14:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.460646 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.476738 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.482258 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.504335 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.510313 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.510373 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.510397 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.510425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.510445 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:41Z","lastTransitionTime":"2026-01-29T14:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.520447 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.539695 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.557458 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.579283 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.596999 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 
14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.611267 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.613130 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.613192 4753 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.613206 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.613231 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.613247 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:41Z","lastTransitionTime":"2026-01-29T14:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.634996 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.653289 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.676479 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.700098 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 
14:03:41.716523 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.716592 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.716609 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.716638 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.716656 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:41Z","lastTransitionTime":"2026-01-29T14:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.722769 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.741434 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.764888 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.783643 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.819611 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.819653 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.819662 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.819676 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.819687 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:41Z","lastTransitionTime":"2026-01-29T14:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.819639 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf733
1d1ab414695aefcf43e30139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"dbus.Signature{str:\\\\\\\"(uo)\\\\\\\"}, value:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\"}}, \\\\\\\"StateChangeTimestamp\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x64987519d763c}, \\\\\\\"StateChangeTimestampMonotonic\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x4d04347}, \\\\\\\"SubState\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"running\\\\\\\"}}, []string{\\\\\\\"Conditions\\\\\\\", \\\\\\\"Asserts\\\\\\\"}}, Sequence:0x20}\\\\nI0129 14:03:18.952571 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1\\\\\\\", Name:\\\\\\\"org.freedesktop.systemd1.Manager.JobRemoved\\\\\\\", Body:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\", \\\\\\\"crio-conmon-ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e.scope\\\\\\\", \\\\\\\"done\\\\\\\"}, Sequence:0x21}\\\\nI0129 14:03:18.952582 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2dconmon_2dded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x22}\\\\nI0129 14:03:18.952593 6189 ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:41Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.922651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.922734 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.922760 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.922790 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:41 crc kubenswrapper[4753]: I0129 14:03:41.922814 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:41Z","lastTransitionTime":"2026-01-29T14:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.025585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.025683 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.025705 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.025776 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.025807 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.120760 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 12:52:21.60142801 +0000 UTC
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.129281 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.129322 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.129339 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.129357 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.129368 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.149574 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:42 crc kubenswrapper[4753]: E0129 14:03:42.150038 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.150323 4753 scope.go:117] "RemoveContainer" containerID="f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.232678 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.233281 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.233348 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.233422 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.233534 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.336772 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.337270 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.337475 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.337712 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.337920 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.441497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.441537 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.441551 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.441569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.441579 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.528314 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/1.log" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.531395 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.531892 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.544285 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.544381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.544406 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.544441 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.544465 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.546685 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.564422 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.587584 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.607992 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd67e089-3116-4162-b956-e2a8c2c71beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37d5fc7c12e2b1310d540f26ef183c81181b633925e1ae8eaa54cd0852a80c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d367f98d62f21486659db334571303be0b003c240182fe5fc70f072014f31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bff2518f4e02462e15ff6ddba09a3c44fb04cb19072d20841bb3fba30106d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.623673 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.646478 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.647493 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.647525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.647536 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.647553 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.647564 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.664377 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.688684 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.716287 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1
baa27b1106b699344cb30491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"dbus.Signature{str:\\\\\\\"(uo)\\\\\\\"}, value:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\"}}, \\\\\\\"StateChangeTimestamp\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x64987519d763c}, \\\\\\\"StateChangeTimestampMonotonic\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x4d04347}, \\\\\\\"SubState\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"running\\\\\\\"}}, []string{\\\\\\\"Conditions\\\\\\\", \\\\\\\"Asserts\\\\\\\"}}, Sequence:0x20}\\\\nI0129 14:03:18.952571 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1\\\\\\\", Name:\\\\\\\"org.freedesktop.systemd1.Manager.JobRemoved\\\\\\\", Body:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\", \\\\\\\"crio-conmon-ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e.scope\\\\\\\", \\\\\\\"done\\\\\\\"}, Sequence:0x21}\\\\nI0129 14:03:18.952582 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2dconmon_2dded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface {}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x22}\\\\nI0129 14:03:18.952593 6189 
ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.733096 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.750406 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.750449 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.750458 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.750474 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.750483 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.751104 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.765303 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.790753 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.807563 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.820594 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.844372 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 
2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.853719 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.853782 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.853795 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.853815 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.853832 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.860644 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994
82919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.877239 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.887707 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.887757 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.887773 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.887795 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.887812 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: E0129 14:03:42.909547 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.917519 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.917585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.917599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.917620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.917634 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: E0129 14:03:42.933526 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.937838 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.937882 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.937894 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.937914 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.937930 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: E0129 14:03:42.950257 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.954003 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.954041 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.954052 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.954068 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.954079 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: E0129 14:03:42.966177 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.970657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.970696 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.970708 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.970725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.970738 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:42 crc kubenswrapper[4753]: E0129 14:03:42.985664 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:42Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:42 crc kubenswrapper[4753]: E0129 14:03:42.985837 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.987713 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.987763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.987774 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.987792 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:42 crc kubenswrapper[4753]: I0129 14:03:42.987806 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:42Z","lastTransitionTime":"2026-01-29T14:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.091476 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.091537 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.091550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.091573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.091587 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:43Z","lastTransitionTime":"2026-01-29T14:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.121922 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 13:33:28.169442882 +0000 UTC Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.148682 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.148763 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.148841 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:43 crc kubenswrapper[4753]: E0129 14:03:43.148876 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:43 crc kubenswrapper[4753]: E0129 14:03:43.148971 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:43 crc kubenswrapper[4753]: E0129 14:03:43.149044 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.194521 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.194572 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.194585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.194605 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.194618 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:43Z","lastTransitionTime":"2026-01-29T14:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.298333 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.298387 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.298405 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.298428 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.298446 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:43Z","lastTransitionTime":"2026-01-29T14:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.401997 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.402055 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.402074 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.402096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.402114 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:43Z","lastTransitionTime":"2026-01-29T14:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.505557 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.505678 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.505704 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.505733 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.505753 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:43Z","lastTransitionTime":"2026-01-29T14:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.538572 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/2.log" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.539665 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/1.log" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.543673 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" exitCode=1 Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.543760 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.543862 4753 scope.go:117] "RemoveContainer" containerID="f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.545025 4753 scope.go:117] "RemoveContainer" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:03:43 crc kubenswrapper[4753]: E0129 14:03:43.545393 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.567115 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.586286 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.604354 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.609361 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.609402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.609414 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.609433 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.609445 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:43Z","lastTransitionTime":"2026-01-29T14:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.639755 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.657199 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.674196 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.698393 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 
2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.712480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.712546 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.712560 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.712581 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.712595 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:43Z","lastTransitionTime":"2026-01-29T14:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.718958 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994
82919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.735628 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.756045 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.772990 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.787485 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.803118 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd67e089-3116-4162-b956-e2a8c2c71beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37d5fc7c12e2b1310d540f26ef183c81181b633925e1ae8eaa54cd0852a80c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d367f98d62f21486659db334571303be0b003c240182fe5fc70f072014f31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bff2518f4e02462e15ff6ddba09a3c44fb04cb19072d20841bb3fba30106d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.815635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.815699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.815714 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.815731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.815743 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:43Z","lastTransitionTime":"2026-01-29T14:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.817922 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.831788 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.847104 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.862859 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.885577 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cer
t\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3092cbdfb3c0df89954ac0325bbd214ee2cf7331d1ab414695aefcf43e30139\\\",\\\"exitCode\\\":2,\\\"finishedAt\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"message\\\":\\\"dbus.Signature{str:\\\\\\\"(uo)\\\\\\\"}, value:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\"}}, \\\\\\\"StateChangeTimestamp\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x64987519d763c}, \\\\\\\"StateChangeTimestampMonotonic\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"t\\\\\\\"}, value:0x4d04347}, \\\\\\\"SubState\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"running\\\\\\\"}}, []string{\\\\\\\"Conditions\\\\\\\", \\\\\\\"Asserts\\\\\\\"}}, Sequence:0x20}\\\\nI0129 14:03:18.952571 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1\\\\\\\", Name:\\\\\\\"org.freedesktop.systemd1.Manager.JobRemoved\\\\\\\", Body:[]interface {}{0x91e, \\\\\\\"/org/freedesktop/systemd1/job/2334\\\\\\\", \\\\\\\"crio-conmon-ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e.scope\\\\\\\", \\\\\\\"done\\\\\\\"}, Sequence:0x21}\\\\nI0129 14:03:18.952582 6189 udn_isolation.go:361] D-Bus event received: \\\\u0026dbus.Signal{Sender:\\\\\\\"org.freedesktop.systemd1\\\\\\\", Path:\\\\\\\"/org/freedesktop/systemd1/unit/crio_2dconmon_2dded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e_2escope\\\\\\\", Name:\\\\\\\"org.freedesktop.DBus.Properties.PropertiesChanged\\\\\\\", Body:[]interface 
{}{\\\\\\\"org.freedesktop.systemd1.Scope\\\\\\\", map[string]dbus.Variant{\\\\\\\"Controller\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"\\\\\\\"}, \\\\\\\"Result\\\\\\\":dbus.Variant{sig:dbus.Signature{str:\\\\\\\"s\\\\\\\"}, value:\\\\\\\"success\\\\\\\"}}, []string{}}, Sequence:0x22}\\\\nI0129 14:03:18.952593 6189 ud\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:43Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 14:03:43.161657 6453 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0129 14:03:43.161748 6453 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0129 14:03:43.161770 6453 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0129 14:03:43.161862 6453 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 14:03:43.161902 6453 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 14:03:43.162242 6453 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 14:03:43.162337 6453 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 14:03:43.162373 6453 ovnkube.go:599] Stopped ovnkube\\\\nI0129 14:03:43.162427 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 14:03:43.162607 6453 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:43Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.919469 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.919540 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.919558 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.919584 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:43 crc kubenswrapper[4753]: I0129 14:03:43.919607 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:43Z","lastTransitionTime":"2026-01-29T14:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.022517 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.022603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.022639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.022661 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.022678 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.122803 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 14:47:10.050336544 +0000 UTC Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.125419 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.125492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.125513 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.125550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.125587 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.148533 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:44 crc kubenswrapper[4753]: E0129 14:03:44.148749 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.229124 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.229227 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.229245 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.229267 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.229284 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.333175 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.333235 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.333251 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.333273 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.333286 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.436320 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.436367 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.436408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.436430 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.436442 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.539530 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.539597 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.539620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.539651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.539673 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.550193 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/2.log" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.555770 4753 scope.go:117] "RemoveContainer" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:03:44 crc kubenswrapper[4753]: E0129 14:03:44.556071 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.576374 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.601526 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.631246 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.642352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.642408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.642425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.642479 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.642499 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.665365 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.682567 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.695610 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.711112 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 
2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.724244 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.735604 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc 
kubenswrapper[4753]: I0129 14:03:44.745523 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.745567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.745580 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.745599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.745613 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.751442 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 
14:03:44.766998 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.788395 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.805918 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd67e089-3116-4162-b956-e2a8c2c71beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37d5fc7c12e2b1310d540f26ef183c81181b633925e1ae8eaa54cd0852a80c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d367f98d62f21486659db334571303be0b003c240182fe5fc70f072014f31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bff2518f4e02462e15ff6ddba09a3c44fb04cb19072d20841bb3fba30106d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.823940 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.838977 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.848490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.848551 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.848570 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.848595 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.848614 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.855746 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.870523 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\
\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.898037 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1
baa27b1106b699344cb30491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:43Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 14:03:43.161657 6453 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0129 14:03:43.161748 6453 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0129 14:03:43.161770 6453 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0129 14:03:43.161862 6453 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 14:03:43.161902 6453 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 14:03:43.162242 6453 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 14:03:43.162337 6453 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 14:03:43.162373 6453 ovnkube.go:599] Stopped ovnkube\\\\nI0129 14:03:43.162427 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 14:03:43.162607 6453 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:44Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.951715 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.951786 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.951802 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.951829 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:44 crc kubenswrapper[4753]: I0129 14:03:44.951849 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:44Z","lastTransitionTime":"2026-01-29T14:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.055897 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.056046 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.056070 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.056095 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.056112 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.122989 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 23:09:36.185316385 +0000 UTC
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.148852 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.148968 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.148871 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 14:03:45 crc kubenswrapper[4753]: E0129 14:03:45.149066 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 14:03:45 crc kubenswrapper[4753]: E0129 14:03:45.149247 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 14:03:45 crc kubenswrapper[4753]: E0129 14:03:45.149451 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.159553 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.159603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.159621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.159646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.159665 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.263588 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.263651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.263670 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.263697 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.263717 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.367504 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.367586 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.367613 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.367643 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.367668 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.470879 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.470940 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.470959 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.470987 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.471011 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.573923 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.573992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.574014 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.574046 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.574074 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.677124 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.677239 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.677272 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.677323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.677345 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.781118 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.781229 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.781249 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.781281 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.781307 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.884253 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.884322 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.884334 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.884365 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.884379 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.987482 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.987579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.987598 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.987628 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:45 crc kubenswrapper[4753]: I0129 14:03:45.987646 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:45Z","lastTransitionTime":"2026-01-29T14:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.090554 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.090603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.090615 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.090634 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.090646 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:46Z","lastTransitionTime":"2026-01-29T14:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.127452 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 05:29:34.32404291 +0000 UTC
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.150092 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:46 crc kubenswrapper[4753]: E0129 14:03:46.150351 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.169819 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.186308 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.193067 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.193096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.193106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.193121 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.193132 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:46Z","lastTransitionTime":"2026-01-29T14:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.203094 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.217791 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.237048 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4
698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:43Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 14:03:43.161657 6453 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0129 14:03:43.161748 6453 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0129 14:03:43.161770 6453 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0129 14:03:43.161862 6453 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 14:03:43.161902 6453 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 14:03:43.162242 6453 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 14:03:43.162337 6453 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 14:03:43.162373 6453 ovnkube.go:599] Stopped ovnkube\\\\nI0129 14:03:43.162427 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 14:03:43.162607 6453 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.261363 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd67e089-3116-4162-b956-e2a8c2c71beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37d5fc7c12e2b1310d540f26ef183c81181b633925e1ae8eaa54cd0852a80c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d367f98d62f21486659db334571303be0b003c240182fe5fc70f072014f31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bff2518f4e02462e15ff6ddba09a3c44fb04cb19072d20841bb3fba30106d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.282674 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.296050 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.296127 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.296142 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.296190 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.296207 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:46Z","lastTransitionTime":"2026-01-29T14:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.298570 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.313607 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.337498 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f
0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.358210 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.372462 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.390891 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.399527 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.399599 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.399617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.399644 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.399662 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:46Z","lastTransitionTime":"2026-01-29T14:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.407642 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.442284 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.463982 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.481004 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.502454 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.502512 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.502536 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.502567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.502589 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:46Z","lastTransitionTime":"2026-01-29T14:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.508889 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9
e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:46Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.606523 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.606598 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.606617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.606643 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.606661 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:46Z","lastTransitionTime":"2026-01-29T14:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.710329 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.710412 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.710435 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.710468 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.710491 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:46Z","lastTransitionTime":"2026-01-29T14:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.813767 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.813831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.813841 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.813861 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.813873 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:46Z","lastTransitionTime":"2026-01-29T14:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.917973 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.918055 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.918082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.918118 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:46 crc kubenswrapper[4753]: I0129 14:03:46.918142 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:46Z","lastTransitionTime":"2026-01-29T14:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.021123 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.021195 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.021204 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.021220 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.021232 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.124342 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.124406 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.124426 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.124452 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.124472 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.127729 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:35:35.689121243 +0000 UTC Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.149424 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.149482 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.149478 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:47 crc kubenswrapper[4753]: E0129 14:03:47.149640 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:47 crc kubenswrapper[4753]: E0129 14:03:47.149782 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:47 crc kubenswrapper[4753]: E0129 14:03:47.149936 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.228604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.228676 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.228693 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.228719 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.228743 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.332223 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.332300 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.332320 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.332351 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.332376 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.435639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.435710 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.435729 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.435758 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.435777 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.539300 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.539379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.539396 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.539422 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.539440 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.642319 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.642380 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.642392 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.642417 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.642435 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.745525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.745567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.745576 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.745593 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.745603 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.848490 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.848547 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.848565 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.848589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.848606 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.951541 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.951596 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.951607 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.951626 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:47 crc kubenswrapper[4753]: I0129 14:03:47.951641 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:47Z","lastTransitionTime":"2026-01-29T14:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.055226 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.055292 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.055310 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.055337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.055359 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.128842 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 14:18:43.74013408 +0000 UTC Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.149331 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:48 crc kubenswrapper[4753]: E0129 14:03:48.149522 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.157916 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.157959 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.157968 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.157982 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.157992 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.261126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.261228 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.261247 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.261275 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.261294 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.364450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.364545 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.364569 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.364596 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.364615 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.467576 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.467639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.467657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.467683 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.467702 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.570886 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.570951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.570965 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.570987 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.571001 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.674292 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.674373 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.674397 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.674428 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.674451 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.778102 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.778307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.778328 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.778357 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.778375 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.882686 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.882757 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.882775 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.882805 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.882825 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.986492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.986605 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.986625 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.986657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:48 crc kubenswrapper[4753]: I0129 14:03:48.986679 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:48Z","lastTransitionTime":"2026-01-29T14:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.089494 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.089540 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.089553 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.089573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.089588 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:49Z","lastTransitionTime":"2026-01-29T14:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.129774 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:22:36.306975505 +0000 UTC Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.149221 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.149249 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.149229 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:49 crc kubenswrapper[4753]: E0129 14:03:49.149362 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:49 crc kubenswrapper[4753]: E0129 14:03:49.149608 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:49 crc kubenswrapper[4753]: E0129 14:03:49.149652 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.192790 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.192831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.192842 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.192858 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.192870 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:49Z","lastTransitionTime":"2026-01-29T14:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.297212 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.297347 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.297374 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.297404 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.297427 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:49Z","lastTransitionTime":"2026-01-29T14:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.401501 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.401585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.401603 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.401631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.401651 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:49Z","lastTransitionTime":"2026-01-29T14:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.504295 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.504357 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.504370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.504423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.504437 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:49Z","lastTransitionTime":"2026-01-29T14:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.608140 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.608231 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.608280 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.608306 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.608324 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:49Z","lastTransitionTime":"2026-01-29T14:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.713973 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.714033 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.714049 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.714077 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.714095 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:49Z","lastTransitionTime":"2026-01-29T14:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.817407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.817511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.817566 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.817595 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.817646 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:49Z","lastTransitionTime":"2026-01-29T14:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.921220 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.921284 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.921301 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.921327 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:49 crc kubenswrapper[4753]: I0129 14:03:49.921348 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:49Z","lastTransitionTime":"2026-01-29T14:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.025199 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.025432 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.025457 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.025493 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.025514 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.128526 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.128579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.128589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.128606 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.128621 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.130726 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 14:52:27.593153916 +0000 UTC
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.149101 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:50 crc kubenswrapper[4753]: E0129 14:03:50.149263 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581"
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.231391 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.231452 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.231472 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.231497 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.231517 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.335535 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.335629 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.335645 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.335677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.335699 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.438706 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.438750 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.438766 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.438787 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.438800 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.542584 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.542650 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.542666 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.542691 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.542710 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.646637 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.646760 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.646781 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.646809 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.646830 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.749993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.750059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.750075 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.750095 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.750108 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.852448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.852509 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.852519 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.852540 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.852552 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.955884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.955947 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.955963 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.955982 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:50 crc kubenswrapper[4753]: I0129 14:03:50.955993 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:50Z","lastTransitionTime":"2026-01-29T14:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.058731 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.058782 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.058793 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.058815 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.058830 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:51Z","lastTransitionTime":"2026-01-29T14:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.179410 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.179480 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.179534 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 23:41:57.544145644 +0000 UTC
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.179671 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 29 14:03:51 crc kubenswrapper[4753]: E0129 14:03:51.179853 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 29 14:03:51 crc kubenswrapper[4753]: E0129 14:03:51.180035 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 29 14:03:51 crc kubenswrapper[4753]: E0129 14:03:51.180111 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.181082 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.181131 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.181194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.181220 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.181237 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:51Z","lastTransitionTime":"2026-01-29T14:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.284059 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.284117 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.284133 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.284172 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.284188 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:51Z","lastTransitionTime":"2026-01-29T14:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.386549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.386594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.386604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.386618 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.386628 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:51Z","lastTransitionTime":"2026-01-29T14:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.489393 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.489452 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.489471 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.489495 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.489512 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:51Z","lastTransitionTime":"2026-01-29T14:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.591551 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.591594 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.591606 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.591623 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.591636 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:51Z","lastTransitionTime":"2026-01-29T14:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.694601 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.694640 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.694649 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.694664 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.694674 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:51Z","lastTransitionTime":"2026-01-29T14:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.797622 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.797690 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.797699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.797720 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.797733 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:51Z","lastTransitionTime":"2026-01-29T14:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.900482 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.900542 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.900553 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.900568 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.900581 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:51Z","lastTransitionTime":"2026-01-29T14:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:51 crc kubenswrapper[4753]: I0129 14:03:51.991049 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:51 crc kubenswrapper[4753]: E0129 14:03:51.991322 4753 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 29 14:03:51 crc kubenswrapper[4753]: E0129 14:03:51.991429 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs podName:281741ae-7781-4682-8b1d-207c9a437581 nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.99139919 +0000 UTC m=+98.686133572 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs") pod "network-metrics-daemon-57lv7" (UID: "281741ae-7781-4682-8b1d-207c9a437581") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.003238 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.003310 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.003321 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.003363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.003374 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.106667 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.106744 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.106768 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.106798 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.106822 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.149412 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:52 crc kubenswrapper[4753]: E0129 14:03:52.149590 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.180005 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 20:07:26.009743415 +0000 UTC
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.209248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.209308 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.209324 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.209344 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.209356 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.312486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.312550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.312571 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.312596 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.312616 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.415394 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.415436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.415448 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.415464 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.415475 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.518466 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.518505 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.518517 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.518534 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.518548 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.621665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.621711 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.621721 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.621737 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.621749 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.724992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.725038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.725047 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.725064 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.725075 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.828466 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.828563 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.828580 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.828614 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.828634 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.931614 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.931665 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.931677 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.931695 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:52 crc kubenswrapper[4753]: I0129 14:03:52.931707 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:52Z","lastTransitionTime":"2026-01-29T14:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.034369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.034434 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.034454 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.034478 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.034500 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.105869 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.105941 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.105955 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.105995 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.106010 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: E0129 14:03:53.123971 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.128686 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.128745 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.128764 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.128790 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.128810 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: E0129 14:03:53.143644 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.147543 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.147586 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.147600 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.147619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.147633 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.148885 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.148950 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.148969 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:53 crc kubenswrapper[4753]: E0129 14:03:53.149022 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:53 crc kubenswrapper[4753]: E0129 14:03:53.149122 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:53 crc kubenswrapper[4753]: E0129 14:03:53.149250 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:53 crc kubenswrapper[4753]: E0129 14:03:53.159966 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.164819 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.164859 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.164872 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.164889 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.164902 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: E0129 14:03:53.178588 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.180544 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 03:14:16.975758183 +0000 UTC Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.183739 4753 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.183782 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.183795 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.183812 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.183826 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: E0129 14:03:53.198182 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.200055 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.200097 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.200109 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.200128 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.200141 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.303250 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.303359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.303375 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.303394 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.303407 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.405955 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.406024 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.406042 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.406057 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.406068 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.509228 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.509293 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.509313 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.509341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.509361 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.591045 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vfrvp_63926a91-5e42-4768-8277-55a0113cb5e2/kube-multus/0.log" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.591099 4753 generic.go:334] "Generic (PLEG): container finished" podID="63926a91-5e42-4768-8277-55a0113cb5e2" containerID="450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6" exitCode=1 Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.591138 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vfrvp" event={"ID":"63926a91-5e42-4768-8277-55a0113cb5e2","Type":"ContainerDied","Data":"450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.591612 4753 scope.go:117] "RemoveContainer" containerID="450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.604498 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.618565 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.618626 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.618651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.618717 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.618744 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.635251 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.650898 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.663584 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.681961 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 
2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.695782 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.712712 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.723141 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.723205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.723218 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.723234 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.723246 4753 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.725959 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.740241 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.761538 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1
baa27b1106b699344cb30491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:43Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 14:03:43.161657 6453 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0129 14:03:43.161748 6453 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0129 14:03:43.161770 6453 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0129 14:03:43.161862 6453 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 14:03:43.161902 6453 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 14:03:43.162242 6453 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 14:03:43.162337 6453 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 14:03:43.162373 6453 ovnkube.go:599] Stopped ovnkube\\\\nI0129 14:03:43.162427 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 14:03:43.162607 6453 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.775723 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd67e089-3116-4162-b956-e2a8c2c71beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37d5fc7c12e2b1310d540f26ef183c81181b633925e1ae8eaa54cd0852a80c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d367f98d62f21486659db334571303be0b003c240182fe5fc70f072014f31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bff2518f4e02462e15ff6ddba09a3c44fb04cb19072d20841bb3fba30106d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.792319 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.806880 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.819856 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.825144 4753 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.825214 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.825228 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.825243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.825253 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.835512 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"2026-01-29T14:03:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85093681-3c1a-4dcd-bd26-81113c07771c\\\\n2026-01-29T14:03:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85093681-3c1a-4dcd-bd26-81113c07771c to /host/opt/cni/bin/\\\\n2026-01-29T14:03:08Z [verbose] 
multus-daemon started\\\\n2026-01-29T14:03:08Z [verbose] Readiness Indicator file check\\\\n2026-01-29T14:03:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.852608 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.865137 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.877756 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:53Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.928056 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.928122 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.928135 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.928359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:53 crc kubenswrapper[4753]: I0129 14:03:53.928373 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:53Z","lastTransitionTime":"2026-01-29T14:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.031407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.031461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.031478 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.031501 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.031518 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.134057 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.134114 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.134123 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.134137 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.134172 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.149419 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7"
Jan 29 14:03:54 crc kubenswrapper[4753]: E0129 14:03:54.149578 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.180746 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 00:29:52.844304486 +0000 UTC
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.236511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.236553 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.236565 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.236581 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.236599 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.339120 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.339192 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.339204 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.339223 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.339236 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.443479 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.443547 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.443565 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.443590 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.443611 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.547076 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.547286 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.547381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.547483 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.547561 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.598657 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vfrvp_63926a91-5e42-4768-8277-55a0113cb5e2/kube-multus/0.log"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.598813 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vfrvp" event={"ID":"63926a91-5e42-4768-8277-55a0113cb5e2","Type":"ContainerStarted","Data":"a178927f4539cbdccdd23649d9477c2c6c0b238d6a9702240d428e1e2fe90d4b"}
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.619506 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z"
Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.636739 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.649199 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a178927f4539cbdccdd23649d9477c2c6c0b238d6a9702240d428e1e2fe90d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"2026-01-29T14:03:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85093681-3c1a-4dcd-bd26-81113c07771c\\\\n2026-01-29T14:03:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85093681-3c1a-4dcd-bd26-81113c07771c to /host/opt/cni/bin/\\\\n2026-01-29T14:03:08Z [verbose] multus-daemon started\\\\n2026-01-29T14:03:08Z [verbose] Readiness Indicator file check\\\\n2026-01-29T14:03:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.650761 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.650788 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.650797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.650813 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.650824 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.677023 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:43Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 14:03:43.161657 6453 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0129 14:03:43.161748 6453 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0129 14:03:43.161770 6453 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0129 14:03:43.161862 6453 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 14:03:43.161902 6453 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 14:03:43.162242 6453 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 14:03:43.162337 6453 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 14:03:43.162373 6453 ovnkube.go:599] Stopped ovnkube\\\\nI0129 14:03:43.162427 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 14:03:43.162607 6453 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.694676 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd67e089-3116-4162-b956-e2a8c2c71beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37d5fc7c12e2b1310d540f26ef183c81181b633925e1ae8eaa54cd0852a80c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d367f98d62f21486659db334571303be0b003c240182fe5fc70f072014f31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bff2518f4e02462e15ff6ddba09a3c44fb04cb19072d20841bb3fba30106d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.710702 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.728947 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.743000 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.753701 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.753752 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.753764 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.753784 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.753802 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.762730 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.780047 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.797350 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.811697 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 
14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.825938 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.854431 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.856039 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.856070 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.856084 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.856104 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.856118 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.875094 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.890486 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.907311 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.921287 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:54Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.958860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.958927 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.958948 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.958973 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:54 crc kubenswrapper[4753]: I0129 14:03:54.958995 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:54Z","lastTransitionTime":"2026-01-29T14:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.061042 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.061111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.061137 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.061203 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.061300 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.149363 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.149368 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.149392 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:55 crc kubenswrapper[4753]: E0129 14:03:55.149523 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:55 crc kubenswrapper[4753]: E0129 14:03:55.149783 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:55 crc kubenswrapper[4753]: E0129 14:03:55.149923 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.163589 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.163633 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.163646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.163663 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.163675 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.181258 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 09:46:07.821016302 +0000 UTC Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.266336 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.266379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.266390 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.266408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.266423 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.368758 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.368837 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.368859 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.368884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.368904 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.471779 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.471841 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.471859 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.471880 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.471893 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.574730 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.574804 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.574851 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.574879 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.574897 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.677655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.677746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.677770 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.677798 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.677822 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.780740 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.780792 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.780802 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.780821 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.780834 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.883867 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.883941 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.883953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.883972 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.883990 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.987235 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.987312 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.987327 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.987351 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:55 crc kubenswrapper[4753]: I0129 14:03:55.987368 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:55Z","lastTransitionTime":"2026-01-29T14:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.089700 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.089753 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.089763 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.089779 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.089790 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:56Z","lastTransitionTime":"2026-01-29T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.149340 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:56 crc kubenswrapper[4753]: E0129 14:03:56.149686 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.164902 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.165008 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.182383 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 03:33:16.91962169 +0000 UTC Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.192225 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.192265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.192276 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.192294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.192310 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:56Z","lastTransitionTime":"2026-01-29T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.195423 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.209936 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.219955 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.236669 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 
2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.248556 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.267080 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.282536 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.299341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.299388 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.299402 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.299423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.299437 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:56Z","lastTransitionTime":"2026-01-29T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.299959 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.320847 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:43Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 14:03:43.161657 6453 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0129 14:03:43.161748 6453 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0129 14:03:43.161770 6453 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0129 14:03:43.161862 6453 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 14:03:43.161902 6453 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 14:03:43.162242 6453 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 14:03:43.162337 6453 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 14:03:43.162373 6453 ovnkube.go:599] Stopped ovnkube\\\\nI0129 14:03:43.162427 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 14:03:43.162607 6453 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.337589 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd67e089-3116-4162-b956-e2a8c2c71beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37d5fc7c12e2b1310d540f26ef183c81181b633925e1ae8eaa54cd0852a80c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d367f98d62f21486659db334571303be0b003c240182fe5fc70f072014f31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bff2518f4e02462e15ff6ddba09a3c44fb04cb19072d20841bb3fba30106d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.349138 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.360864 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.375468 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.390561 4753 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a178927f4539cbdccdd23649d9477c2c6c0b238d6a9702240d428e1e2fe90d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"2026-01-29T14:03:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85093681-3c1a-4dcd-bd26-81113c07771c\\\\n2026-01-29T14:03:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85093681-3c1a-4dcd-bd26-81113c07771c to /host/opt/cni/bin/\\\\n2026-01-29T14:03:08Z [verbose] multus-daemon started\\\\n2026-01-29T14:03:08Z [verbose] Readiness Indicator file check\\\\n2026-01-29T14:03:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.402872 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.402918 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.402930 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.402951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.402969 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:56Z","lastTransitionTime":"2026-01-29T14:03:56Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.405061 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-
29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.419828 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.432281 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:03:56Z is after 2025-08-24T17:21:41Z" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.506752 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.506811 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.506830 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.506855 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.506880 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:56Z","lastTransitionTime":"2026-01-29T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.608570 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.608618 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.608632 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.608651 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.608663 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:56Z","lastTransitionTime":"2026-01-29T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.711367 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.711409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.711420 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.711440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.711453 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:56Z","lastTransitionTime":"2026-01-29T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.815077 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.815133 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.815170 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.815194 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.815209 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:56Z","lastTransitionTime":"2026-01-29T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.918649 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.918712 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.918725 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.918746 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:56 crc kubenswrapper[4753]: I0129 14:03:56.918759 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:56Z","lastTransitionTime":"2026-01-29T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.022384 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.022436 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.022447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.022464 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.022476 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.125783 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.125828 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.125837 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.125853 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.125863 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.149275 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.149293 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:57 crc kubenswrapper[4753]: E0129 14:03:57.149427 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:57 crc kubenswrapper[4753]: E0129 14:03:57.149474 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.149689 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:57 crc kubenswrapper[4753]: E0129 14:03:57.149780 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.182744 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 16:13:23.273871378 +0000 UTC Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.229189 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.229282 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.229304 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.229363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.229387 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.333516 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.333572 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.333587 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.333607 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.333622 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.437563 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.437621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.437635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.437655 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.437668 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.540568 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.540635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.540654 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.540679 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.540696 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.644205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.644259 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.644277 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.644300 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.644317 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.747106 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.747282 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.747327 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.747347 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.747379 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.850620 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.850673 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.850685 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.850706 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.850718 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.953873 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.953929 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.953948 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.953977 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:57 crc kubenswrapper[4753]: I0129 14:03:57.953996 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:57Z","lastTransitionTime":"2026-01-29T14:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.057491 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.057533 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.057544 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.057561 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.057574 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.148992 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:03:58 crc kubenswrapper[4753]: E0129 14:03:58.149361 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.149778 4753 scope.go:117] "RemoveContainer" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:03:58 crc kubenswrapper[4753]: E0129 14:03:58.149988 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.160244 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.160320 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.160332 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.160351 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.160363 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.183562 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 14:14:44.755174511 +0000 UTC Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.263188 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.263230 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.263243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.263261 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.263272 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.366076 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.366124 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.366136 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.366184 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.366203 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.469038 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.469079 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.469090 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.469111 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.469122 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.571715 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.571789 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.571802 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.571823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.571837 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.674742 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.674816 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.674834 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.674932 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.674954 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.777567 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.777635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.777657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.777685 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.777704 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.880775 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.880885 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.880906 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.880982 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.881004 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.984851 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.984930 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.984951 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.984981 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:58 crc kubenswrapper[4753]: I0129 14:03:58.984998 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:58Z","lastTransitionTime":"2026-01-29T14:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.088265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.088322 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.088337 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.088359 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.088371 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:59Z","lastTransitionTime":"2026-01-29T14:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.148768 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.148837 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.148969 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:03:59 crc kubenswrapper[4753]: E0129 14:03:59.148995 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:03:59 crc kubenswrapper[4753]: E0129 14:03:59.149131 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:03:59 crc kubenswrapper[4753]: E0129 14:03:59.149405 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.183771 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 21:41:35.641505169 +0000 UTC Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.190924 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.190964 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.190973 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.190990 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.191004 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:59Z","lastTransitionTime":"2026-01-29T14:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.294243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.294290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.294302 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.294323 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.294335 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:59Z","lastTransitionTime":"2026-01-29T14:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.396804 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.396850 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.396860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.396877 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.396888 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:59Z","lastTransitionTime":"2026-01-29T14:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.499554 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.499607 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.499617 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.499639 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.499650 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:59Z","lastTransitionTime":"2026-01-29T14:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.602699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.602765 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.602789 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.602818 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.602838 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:59Z","lastTransitionTime":"2026-01-29T14:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.705853 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.705915 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.705928 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.705948 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.705961 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:59Z","lastTransitionTime":"2026-01-29T14:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.808451 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.808499 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.808508 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.808526 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.808538 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:59Z","lastTransitionTime":"2026-01-29T14:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.911144 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.911209 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.911221 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.911239 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:03:59 crc kubenswrapper[4753]: I0129 14:03:59.911249 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:03:59Z","lastTransitionTime":"2026-01-29T14:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.014296 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.014353 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.014364 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.014383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.014397 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.116901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.116956 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.116969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.116988 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.117000 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.148561 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:00 crc kubenswrapper[4753]: E0129 14:04:00.148817 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.184711 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 18:02:32.926375679 +0000 UTC Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.220415 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.220482 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.220499 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.220525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.220547 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.323342 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.323398 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.323409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.323428 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.323439 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.426469 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.426508 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.426518 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.426534 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.426544 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.529550 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.529614 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.529626 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.529648 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.529667 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.632525 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.632588 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.632604 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.632623 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.632637 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.735438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.735499 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.735509 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.735530 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.735543 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.838568 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.838622 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.838632 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.838652 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.838665 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.942098 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.942209 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.942237 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.942266 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:00 crc kubenswrapper[4753]: I0129 14:04:00.942289 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:00Z","lastTransitionTime":"2026-01-29T14:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.045181 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.045236 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.045248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.045267 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.045281 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.148096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.148388 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.148351 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.148409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.148452 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.148467 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.148468 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: E0129 14:04:01.148531 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:04:01 crc kubenswrapper[4753]: E0129 14:04:01.148611 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.148384 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:01 crc kubenswrapper[4753]: E0129 14:04:01.148730 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.185788 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 22:41:32.179224986 +0000 UTC Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.251198 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.251254 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.251265 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.251284 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.251295 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.354813 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.354857 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.354869 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.354891 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.354905 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.457824 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.457891 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.457912 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.457939 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.457959 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.560285 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.560341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.560350 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.560366 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.560411 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.663977 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.664073 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.664099 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.664135 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.664194 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.768522 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.768580 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.768591 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.768608 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.768622 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.872686 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.872752 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.872770 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.872803 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.872823 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.975994 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.976046 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.976060 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.976085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:01 crc kubenswrapper[4753]: I0129 14:04:01.976100 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:01Z","lastTransitionTime":"2026-01-29T14:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.079330 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.079381 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.079396 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.079416 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.079431 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:02Z","lastTransitionTime":"2026-01-29T14:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.149541 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:02 crc kubenswrapper[4753]: E0129 14:04:02.149808 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.182236 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.182274 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.182283 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.182302 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.182314 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:02Z","lastTransitionTime":"2026-01-29T14:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.186723 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 03:50:41.412134321 +0000 UTC Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.284929 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.284981 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.284993 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.285013 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.285034 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:02Z","lastTransitionTime":"2026-01-29T14:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.388955 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.389010 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.389028 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.389051 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.389068 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:02Z","lastTransitionTime":"2026-01-29T14:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.492898 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.492968 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.492982 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.493008 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.493020 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:02Z","lastTransitionTime":"2026-01-29T14:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.596936 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.597036 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.597062 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.597094 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.597135 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:02Z","lastTransitionTime":"2026-01-29T14:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.699831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.699897 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.699916 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.699944 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.699965 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:02Z","lastTransitionTime":"2026-01-29T14:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.802979 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.803050 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.803069 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.803095 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.803115 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:02Z","lastTransitionTime":"2026-01-29T14:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.907222 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.907282 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.907291 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.907308 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:02 crc kubenswrapper[4753]: I0129 14:04:02.907319 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:02Z","lastTransitionTime":"2026-01-29T14:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.009761 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.009816 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.009837 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.009860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.009878 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.113540 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.113600 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.113614 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.113635 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.113648 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.149283 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.149429 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.149424 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:03 crc kubenswrapper[4753]: E0129 14:04:03.149578 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:04:03 crc kubenswrapper[4753]: E0129 14:04:03.149734 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:04:03 crc kubenswrapper[4753]: E0129 14:04:03.149861 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.187497 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 18:34:45.853837516 +0000 UTC Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.216752 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.216824 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.216858 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.216888 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.216910 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.320942 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.321050 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.321072 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.321104 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.321131 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.401507 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.401577 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.401621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.401659 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.401686 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: E0129 14:04:03.423821 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:03Z is after 2025-08-24T17:21:41Z"
Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.429640 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.429698 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.429716 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.429741 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.429760 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.457442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.457493 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.457508 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.457545 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.457561 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: E0129 14:04:03.478757 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:03Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.484776 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.484823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.484837 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.484863 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.484879 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: E0129 14:04:03.505734 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:03Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.511499 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.511565 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.511585 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.511611 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.511628 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: E0129 14:04:03.527723 4753 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-29T14:04:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd5460f3-0655-48f4-971d-c3e6b7a9c2ef\\\",\\\"systemUUID\\\":\\\"aa3924a6-9f3e-446b-bf11-65e8bcfab058\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:03Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:03 crc kubenswrapper[4753]: E0129 14:04:03.527885 4753 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.529891 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
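The retry-exhaustion error above follows a run of identical patch failures, all caused by the expired serving certificate on the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 (expired 2025-08-24T17:21:41Z, current time 2026-01-29). A minimal Go sketch for confirming that expiry from the node itself, assuming the endpoint is reachable locally as the log shows; it skips chain verification so the handshake succeeds even with an expired certificate, then prints the served certificate's validity window:

package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	// Address taken from the webhook Post error in the log above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // diagnostic only: we want the cert, not trust
	})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
			cert.Subject, cert.NotBefore, cert.NotAfter)
	}
}

Against the state captured here, this would be expected to report a notAfter of 2025-08-24T17:21:41Z, matching the x509 verification error in the patch failures.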
event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.529968 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.529987 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.530013 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.530032 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.636263 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.636322 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.636335 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.636355 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.636368 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.739998 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.740062 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.740079 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.740099 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.740113 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.843913 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.843984 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.844009 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.844039 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.844061 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.947549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.947608 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.947625 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.947646 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:03 crc kubenswrapper[4753]: I0129 14:04:03.947664 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:03Z","lastTransitionTime":"2026-01-29T14:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.051140 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.051226 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.051237 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.051255 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.051266 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
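Every NodeNotReady heartbeat in this stretch carries the same readiness message: no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal Go sketch of that presence check, assuming the directory named in the message and the .conf/.conflist/.json extensions that CNI runtimes conventionally load (an assumption about the loader, not taken from this log):

package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	// Directory taken from the kubelet readiness message above; other
	// setups commonly use /etc/cni/net.d instead.
	dir := "/etc/kubernetes/cni/net.d"
	var found []string
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pattern))
		if err != nil {
			continue // only possible on a malformed pattern
		}
		found = append(found, matches...)
	}
	if len(found) == 0 {
		fmt.Println("no CNI configuration files found in", dir)
		return
	}
	for _, f := range found {
		fmt.Println("found CNI config:", f)
	}
}

An empty result here is consistent with the log: until the network provider writes a config into that directory, the container runtime reports NetworkReady=false and the node stays NotReady.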
Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.149033 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:04 crc kubenswrapper[4753]: E0129 14:04:04.149672 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.161370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.161444 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.161461 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.161997 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.162058 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.187816 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 11:25:51.160139812 +0000 UTC Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.265995 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.266085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.266103 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.266667 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.266732 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.371445 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.371542 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.371621 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.371657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.371681 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.475315 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.475370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.475390 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.475417 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.475435 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.578271 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.578346 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.578370 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.578400 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.578421 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.682423 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.682492 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.682505 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.682533 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.682552 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.786223 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.786294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.786306 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.786330 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.786342 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.889766 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.889834 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.889850 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.889878 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.889898 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.993367 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.993441 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.993468 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.993500 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:04 crc kubenswrapper[4753]: I0129 14:04:04.993520 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:04Z","lastTransitionTime":"2026-01-29T14:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.097212 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.097269 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.097294 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.097316 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.097332 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:05Z","lastTransitionTime":"2026-01-29T14:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.148773 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.148785 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.148839 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:05 crc kubenswrapper[4753]: E0129 14:04:05.149098 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:04:05 crc kubenswrapper[4753]: E0129 14:04:05.149380 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:04:05 crc kubenswrapper[4753]: E0129 14:04:05.149471 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.188946 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 17:46:21.655257118 +0000 UTC Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.200379 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.200430 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.200442 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.200464 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.200478 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:05Z","lastTransitionTime":"2026-01-29T14:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.304409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.304486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.304510 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.304540 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.304566 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:05Z","lastTransitionTime":"2026-01-29T14:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.408034 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.408219 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.408246 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.408274 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.408296 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:05Z","lastTransitionTime":"2026-01-29T14:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.511750 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.511822 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.511841 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.511867 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.511883 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:05Z","lastTransitionTime":"2026-01-29T14:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.614723 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.614797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.614821 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.614850 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.614872 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:05Z","lastTransitionTime":"2026-01-29T14:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.718710 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.718790 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.718814 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.718851 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.718876 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:05Z","lastTransitionTime":"2026-01-29T14:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.822143 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.822203 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.822213 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.822229 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.822238 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:05Z","lastTransitionTime":"2026-01-29T14:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.925432 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.925508 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.925529 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.925555 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:05 crc kubenswrapper[4753]: I0129 14:04:05.925576 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:05Z","lastTransitionTime":"2026-01-29T14:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.028841 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.028908 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.028926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.028953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.028972 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.131619 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.131695 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.131721 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.131989 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.132023 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.148926 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:06 crc kubenswrapper[4753]: E0129 14:04:06.149090 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.169342 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd7318c-82fc-4dd8-a683-63d91482f525\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c34fd26eab07375195917e0641340b52b6d2dbbec4913a9b5512fec19df2fba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://154b0de3118bd87d7adfbe58f93f3d11adf7eb3e7a592782ce074f8f5cfaaa62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154b0de3118bd87d7adfbe58f93f3d11adf7eb3e7a592782ce074f8f5cfaaa62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kube
let\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.189418 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 22:15:21.910540234 +0000 UTC Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.189545 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5840803-8eed-4fb0-9307-9a39df0f7603\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\
"}]},{\\\"containerID\\\":\\\"cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0129 14:02:59.940074 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0129 14:02:59.944018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3772716806/tls.crt::/tmp/serving-cert-3772716806/tls.key\\\\\\\"\\\\nI0129 14:03:05.288520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0129 14:03:05.309583 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0129 14:03:05.309711 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0129 14:03:05.309757 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0129 14:03:05.309805 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0129 14:03:05.320460 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0129 14:03:05.320484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320489 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0129 14:03:05.320494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0129 14:03:05.320497 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0129 14:03:05.320499 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0129 14:03:05.320502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0129 14:03:05.320640 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0129 14:03:05.322032 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.209467 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8018d62-b07f-4c99-9613-6f89a55047dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbebccad7b921935f51556cbaa880576c2edaf01797c93e7ec43dfe118e30de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea13a05b6e5fc4460b51ff1c4fa94cd1a131a0d77ca0a473ccd5c953f36cbcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d030bd70dfe1bc52274ef09bd9519364d095902cee549407cf11cc9c37b04f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.229544 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4trnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0052a69-b4c2-41a7-9acc-3c9a936c20b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2804baea1fa3f49ea022b1b142a1776076e62cf73c448ef8c35410779893e738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7qcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4trnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.234760 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.234802 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.234811 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.234831 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.234842 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.245333 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08d5c2a2-f3dc-4cda-991d-bfa44033357e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6abf91f15b53a2aeab13f1efd0ac3cea3d38450abb56e63d0fe0bfeb2157dec3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded50d8515e800f8722cadbee0983890ca22b67d105f55dc826ad1cf460fb56e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4drqk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z8p72\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 
14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.259842 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-57lv7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"281741ae-7781-4682-8b1d-207c9a437581\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-74bbq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-57lv7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.281671 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15790183-7300-4089-86da-5b3e95aaf7ac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142d8131bc499dfe39d3401bdf93dbddd4380fefacb39c741af922612ab536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c43f11d47a9f4704e7b3b0c148689b695ecb7b3ca27ad18d4a7e5095ddc4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2930397fb41e25650747194bc203c88875eb1bab60a9503a8f3e051a01346127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94abfbad752c7833ff8897ad1eeab656aba2f62
666ca816d67d761e0b4f37117\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://628fa81e6e1c548e487fd251970852427570e0b97bd46f9bd18d0b971bc9b7ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98f01a94305fdd09cadc7276765024e2810b447d967f5ea8e2d4e1acac7cb309\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8be39937c3ed1b7c55e95bcf798bf1103f224d5764bf40a56d8c6ecfe0716d8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c21ce015355d75ff5253609ab1e70c6cc3677388e88824a8e409319d7547395\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.295945 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.310243 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-s7czm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bf54982-69e3-4afd-95d9-654f58871b60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c160c045d3f7369cd8f62fde1e1e54ee047218fd1de63aa06a81a5d5889d0a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqvhr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-s7czm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.327813 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf2500dd-260b-477b-905b-6f1c52455890\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aedc6a6f88ca7d3d32c35a8806dc47d498ac84bcc4f3ddea06c7c72a0795ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43fa2469a1482224a3d4580d023b9e870138ae982613cf7f933a682e0958789e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36f7ea6a9b72fedadc11a0eb68e9e226a3dca183df2b9bf7c47bf6a866d40a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://145f786a1008fb06bf4f46045a69f414034300bd13369094cb36d4ea1340c9eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21cb7cb2d7d66d48005ae32000b7a2b6bc55cb56575d84f002b39bc8d992b02e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://204bcdf05698653f001a48991bd590ef580b8ea40ae838414fa0bd47e9413a58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d18ac0596dead4e2ca057aaf89ceb7b9eda45d658ede43ff930b072aae6eeb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9qljt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g2rf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 
2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.337509 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.337556 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.337574 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.337598 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.337616 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.343255 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://968212a69868279d6e2ced6a84dcd58d3ab5ca3ae3fcce5857291d40020f88f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 
crc kubenswrapper[4753]: I0129 14:04:06.360711 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.376656 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaab907a10e06f956df41cae09d00fd8c2acada7ae55e3f53d1bb7106470f1ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://547e976b5c7cd1d987ceba96a6e9581f2701d02316570c221728f0e155696169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.393361 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vfrvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63926a91-5e42-4768-8277-55a0113cb5e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a178927f4539cbdccdd23649d9477c2c6c0b238d6a9702240d428e1e2fe90d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:53Z\\\",\\\"message\\\":\\\"2026-01-29T14:03:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85093681-3c1a-4dcd-bd26-81113c07771c\\\\n2026-01-29T14:03:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85093681-3c1a-4dcd-bd26-81113c07771c to /host/opt/cni/bin/\\\\n2026-01-29T14:03:08Z [verbose] multus-daemon started\\\\n2026-01-29T14:03:08Z [verbose] Readiness Indicator file check\\\\n2026-01-29T14:03:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np2fm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vfrvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.418204 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a796c89a-761f-48d7-80b5-031f75703f32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-29T14:03:43Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0129 14:03:43.161657 6453 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0129 14:03:43.161748 6453 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0129 14:03:43.161770 6453 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0129 14:03:43.161862 6453 factory.go:1336] Added *v1.Node event handler 7\\\\nI0129 14:03:43.161902 6453 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0129 14:03:43.162242 6453 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0129 14:03:43.162337 6453 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0129 14:03:43.162373 6453 ovnkube.go:599] Stopped ovnkube\\\\nI0129 14:03:43.162427 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0129 14:03:43.162607 6453 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9pd9r_openshift-ovn-kubernetes(a796c89a-761f-48d7-80b5-031f75703f32)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:03:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl6hh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9pd9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.435467 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd67e089-3116-4162-b956-e2a8c2c71beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37d5fc7c12e2b1310d540f26ef183c81181b633925e1ae8eaa54cd0852a80c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d367f98d62f21486659db334571303be0b003c240182fe5fc70f072014f31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bff2518f4e02462e15ff6ddba09a3c44fb04cb19072d20841bb3fba30106d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d042279d7fe276d316c880c4bd577c24642eb4dccf904c900a18c42d007cc9b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-29T14:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-29T14:02:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:02:46Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.441602 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.441657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.441678 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.441703 4753 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.441719 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.455416 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.470225 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2184182e3c278a403a5b62cf0c5635732fb40424ef66177499f3002c571a97e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.484704 4753 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49d14260-5f77-47b9-97e1-c843cf322a0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T14:03:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0470216b9ecaaf9cebf4cfe8184f746a2c55b99a4f7642acae6256f3f9aa547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-29T14:03:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdjpc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T14:03:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-x6rpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-29T14:04:06Z is after 2025-08-24T17:21:41Z" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.544324 4753 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.544422 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.544450 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.544486 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.544511 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.647897 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.647986 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.648010 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.648041 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.648063 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.751747 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.751803 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.751824 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.751849 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.751882 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.854882 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.854971 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.854992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.855024 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.855045 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.957868 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.957980 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.958001 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.958032 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:06 crc kubenswrapper[4753]: I0129 14:04:06.958093 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:06Z","lastTransitionTime":"2026-01-29T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.062011 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.062068 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.062085 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.062110 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.062127 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.148880 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.148889 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.148967 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:07 crc kubenswrapper[4753]: E0129 14:04:07.149725 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:04:07 crc kubenswrapper[4753]: E0129 14:04:07.149515 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:04:07 crc kubenswrapper[4753]: E0129 14:04:07.149873 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.165750 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.165815 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.165836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.165867 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.165890 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.190270 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 16:09:20.708916574 +0000 UTC Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.269386 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.269463 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.269480 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.269505 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.269522 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.372531 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.372605 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.372631 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.372667 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.372693 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.477210 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.477307 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.477326 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.477385 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.477407 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.582035 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.582096 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.582115 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.582140 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.582188 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.684710 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.684778 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.684797 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.684823 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.684842 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.787549 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.787918 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.788014 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.788100 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.788237 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.893030 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.893532 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.893690 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.893829 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.893965 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.997634 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.997680 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.997694 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.997711 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:07 crc kubenswrapper[4753]: I0129 14:04:07.997723 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:07Z","lastTransitionTime":"2026-01-29T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.100543 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.100600 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.100611 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.100630 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.100643 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:08Z","lastTransitionTime":"2026-01-29T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.149236 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:08 crc kubenswrapper[4753]: E0129 14:04:08.149451 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.190981 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 20:40:26.435846252 +0000 UTC Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.203291 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.203341 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.203352 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.203369 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.203392 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:08Z","lastTransitionTime":"2026-01-29T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.307253 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.307311 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.307329 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.307354 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.307372 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:08Z","lastTransitionTime":"2026-01-29T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.411129 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.411219 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.411239 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.411264 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.411284 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:08Z","lastTransitionTime":"2026-01-29T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.514356 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.514459 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.514479 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.514506 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.514527 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:08Z","lastTransitionTime":"2026-01-29T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.617851 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.617923 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.617943 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.617971 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.617991 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:08Z","lastTransitionTime":"2026-01-29T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.721839 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.721903 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.721920 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.721944 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.721962 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:08Z","lastTransitionTime":"2026-01-29T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.825248 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.825336 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.825365 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.825399 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.825433 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:08Z","lastTransitionTime":"2026-01-29T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.928953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.929020 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.929036 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.929062 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.929081 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:08Z","lastTransitionTime":"2026-01-29T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.977049 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:08 crc kubenswrapper[4753]: I0129 14:04:08.977123 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:08 crc kubenswrapper[4753]: E0129 14:04:08.977326 4753 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:04:08 crc kubenswrapper[4753]: E0129 14:04:08.977335 4753 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:04:08 crc kubenswrapper[4753]: E0129 14:04:08.977419 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:05:12.977393577 +0000 UTC m=+147.672127989 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 29 14:04:08 crc kubenswrapper[4753]: E0129 14:04:08.977463 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:05:12.977431708 +0000 UTC m=+147.672166130 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.032782 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.032856 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.032875 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.032904 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.032928 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.078125 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.078439 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.078511 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:05:13.078447128 +0000 UTC m=+147.773181550 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.078675 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.078713 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.078739 4753 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.078758 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.078836 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-29 14:05:13.078807477 +0000 UTC m=+147.773542039 (durationBeforeRetry 1m4s). 
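The TearDown failure above means the kubevirt.io.hostpath-provisioner CSI driver has not (re)registered with the kubelet since this restart. Drivers register by placing a socket in the kubelet's plugin registry directory; a sketch for listing what is currently registered, assuming the default --root-dir of /var/lib/kubelet.

```python
# Sketch only: list CSI/device-plugin registration sockets. The path is
# the kubelet default and an assumption for this node.
import pathlib

registry = pathlib.Path("/var/lib/kubelet/plugins_registry")
sockets = sorted(p.name for p in registry.glob("*.sock"))
print("registered plugin sockets:", sockets or "none")
```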
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.079042 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.079092 4753 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.079113 4753 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.079249 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-29 14:05:13.07922287 +0000 UTC m=+147.773957282 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.135736 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.135793 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.135810 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.135836 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.135854 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.148557 4753 util.go:30] "No sandbox for pod can be found. 
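Two things are visible in the mount errors above. First, "object ... not registered" means the kubelet's watch-based ConfigMap and Secret managers have not yet registered those objects for the pod, so the projected volume cannot be built. Second, "durationBeforeRetry 1m4s" is per-volume exponential backoff: 64 s is what a 500 ms initial delay reaches after seven doublings. A sketch of that progression; the constants here are illustrative assumptions, not the kubelet's exact values.

```python
# Sketch only: exponential backoff for mount/unmount retries.
def backoff_seconds(failures: int, initial: float = 0.5,
                    factor: float = 2.0, cap: float = 128.0) -> float:
    """Delay before the next attempt after `failures` failures."""
    return min(initial * factor ** failures, cap)

for n in range(9):
    print(n, backoff_seconds(n))  # failure 7 -> 64.0 s, i.e. 1m4s
```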
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.148573 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.148745 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.148651 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.148876 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:04:09 crc kubenswrapper[4753]: E0129 14:04:09.149044 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.191988 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 16:06:28.136759758 +0000 UTC Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.239325 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.239372 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.239389 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.239413 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.239431 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.342564 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.342622 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.342676 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.342704 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.342721 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.446288 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.446363 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.446388 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.446416 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.446439 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.550058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.550126 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.550144 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.550207 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.550232 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.654068 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.654138 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.654190 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.654218 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.654236 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.757539 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.757600 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.757616 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.757640 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.757661 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.860942 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.861383 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.861409 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.861438 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.861462 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.964667 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.964722 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.964739 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.964764 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:09 crc kubenswrapper[4753]: I0129 14:04:09.964783 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:09Z","lastTransitionTime":"2026-01-29T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.073724 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.073804 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.073842 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.073876 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.073912 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:10Z","lastTransitionTime":"2026-01-29T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.148736 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:10 crc kubenswrapper[4753]: E0129 14:04:10.148965 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.177565 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.177647 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.177672 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.177702 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.177723 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:10Z","lastTransitionTime":"2026-01-29T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.192952 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 01:29:22.311401223 +0000 UTC Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.281727 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.281800 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.281999 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.282029 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.282058 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:10Z","lastTransitionTime":"2026-01-29T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.385815 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.385885 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.385901 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.385938 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.385961 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:10Z","lastTransitionTime":"2026-01-29T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.489199 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.489268 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.489286 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.489311 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.489329 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:10Z","lastTransitionTime":"2026-01-29T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.592761 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.592824 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.592843 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.592912 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.592936 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:10Z","lastTransitionTime":"2026-01-29T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.695798 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.695900 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.695926 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.695953 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.695972 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:10Z","lastTransitionTime":"2026-01-29T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.799503 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.799558 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.799573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.799593 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.799608 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:10Z","lastTransitionTime":"2026-01-29T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.901829 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.901873 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.901886 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.901905 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:10 crc kubenswrapper[4753]: I0129 14:04:10.901918 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:10Z","lastTransitionTime":"2026-01-29T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.004699 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.004776 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.004800 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.004829 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.004854 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.107954 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.108060 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.108103 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.108134 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.108203 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.149073 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.149074 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.149270 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:11 crc kubenswrapper[4753]: E0129 14:04:11.149513 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:04:11 crc kubenswrapper[4753]: E0129 14:04:11.149637 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:04:11 crc kubenswrapper[4753]: E0129 14:04:11.149768 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.193900 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 04:20:06.882321579 +0000 UTC Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.210616 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.210657 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.210718 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.210748 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.210817 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.313224 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.313279 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.313345 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.313385 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.313458 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.416840 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.416893 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.416908 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.416931 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.416947 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.520024 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.520089 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.520112 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.520144 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.520221 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.623686 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.623758 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.623782 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.623812 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.623831 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.726950 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.727029 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.727041 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.727058 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.727089 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.831091 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.831205 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.831240 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.831282 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.831307 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.934336 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.934407 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.934425 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.934451 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:11 crc kubenswrapper[4753]: I0129 14:04:11.934471 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:11Z","lastTransitionTime":"2026-01-29T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.037779 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.037843 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.037860 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.037887 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.037905 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.141577 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.141672 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.141690 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.141716 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.141734 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.149056 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:12 crc kubenswrapper[4753]: E0129 14:04:12.149286 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.194776 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 06:03:49.767795663 +0000 UTC Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.245408 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.245488 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.245511 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.245539 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.245556 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.349305 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.349400 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.349427 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.349465 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.349493 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.453086 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.453141 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.453175 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.453196 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.453209 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.556760 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.557252 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.557447 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.557644 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.557899 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.661930 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.661992 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.662011 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.662043 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.662062 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.765573 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.765649 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.765674 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.765705 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.765727 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.869800 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.869867 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.869889 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.869918 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.869936 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.973197 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.973276 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.973290 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.973318 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:12 crc kubenswrapper[4753]: I0129 14:04:12.973334 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:12Z","lastTransitionTime":"2026-01-29T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.077102 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.077243 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.077286 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.077321 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.077348 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:13Z","lastTransitionTime":"2026-01-29T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.149327 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:13 crc kubenswrapper[4753]: E0129 14:04:13.149462 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.149478 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.150033 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:13 crc kubenswrapper[4753]: E0129 14:04:13.150350 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:04:13 crc kubenswrapper[4753]: E0129 14:04:13.150422 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.150920 4753 scope.go:117] "RemoveContainer" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.179502 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.179543 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.179558 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.179578 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.179591 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:13Z","lastTransitionTime":"2026-01-29T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.195675 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 19:21:45.255636902 +0000 UTC Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.283786 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.283880 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.283903 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.283973 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.283997 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:13Z","lastTransitionTime":"2026-01-29T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.387834 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.387909 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.387935 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.387969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.387995 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:13Z","lastTransitionTime":"2026-01-29T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.491883 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.491952 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.491969 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.491994 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.492012 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:13Z","lastTransitionTime":"2026-01-29T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.595393 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.595478 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.595502 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.595527 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.595549 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:13Z","lastTransitionTime":"2026-01-29T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.678441 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/2.log" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.683792 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerStarted","Data":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.684618 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.699440 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.699884 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.699900 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.699925 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.699946 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:13Z","lastTransitionTime":"2026-01-29T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.752467 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.752555 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.752579 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.752615 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.752640 4753 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-29T14:04:13Z","lastTransitionTime":"2026-01-29T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.780918 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podStartSLOduration=68.78088704 podStartE2EDuration="1m8.78088704s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:13.77877324 +0000 UTC m=+88.473507662" watchObservedRunningTime="2026-01-29 14:04:13.78088704 +0000 UTC m=+88.475621462"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.824412 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vfrvp" podStartSLOduration=68.82439284 podStartE2EDuration="1m8.82439284s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:13.804176437 +0000 UTC m=+88.498910859" watchObservedRunningTime="2026-01-29 14:04:13.82439284 +0000 UTC m=+88.519127232"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.824612 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"]
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.825027 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.828821 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.828926 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.831217 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.831235 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.842022 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podStartSLOduration=68.841999949 podStartE2EDuration="1m8.841999949s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:13.841619249 +0000 UTC m=+88.536353631" watchObservedRunningTime="2026-01-29 14:04:13.841999949 +0000 UTC m=+88.536734341"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.856952 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=32.856924114 podStartE2EDuration="32.856924114s" podCreationTimestamp="2026-01-29 14:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:13.856568935 +0000 UTC m=+88.551303337" watchObservedRunningTime="2026-01-29 14:04:13.856924114 +0000 UTC m=+88.551658506"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.937186 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=17.937135655 podStartE2EDuration="17.937135655s" podCreationTimestamp="2026-01-29 14:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:13.936792176 +0000 UTC m=+88.631526558" watchObservedRunningTime="2026-01-29 14:04:13.937135655 +0000 UTC m=+88.631870037"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.937772 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4trnx" podStartSLOduration=68.937763953 podStartE2EDuration="1m8.937763953s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:13.917988843 +0000 UTC m=+88.612723235" watchObservedRunningTime="2026-01-29 14:04:13.937763953 +0000 UTC m=+88.632498335"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.937825 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09481310-70c7-4287-b13f-7355280530fc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.938067 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09481310-70c7-4287-b13f-7355280530fc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.938197 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/09481310-70c7-4287-b13f-7355280530fc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.938252 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/09481310-70c7-4287-b13f-7355280530fc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.938385 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09481310-70c7-4287-b13f-7355280530fc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.955404 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.955368042 podStartE2EDuration="1m8.955368042s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:13.955087215 +0000 UTC m=+88.649821617" watchObservedRunningTime="2026-01-29 14:04:13.955368042 +0000 UTC m=+88.650102424"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.971691 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.971669016 podStartE2EDuration="1m8.971669016s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:13.97075442 +0000 UTC m=+88.665488802" watchObservedRunningTime="2026-01-29 14:04:13.971669016 +0000 UTC m=+88.666403428"
Jan 29 14:04:13 crc kubenswrapper[4753]: I0129 14:04:13.998953 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g2rf5" podStartSLOduration=68.998921924 podStartE2EDuration="1m8.998921924s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:13.998192384 +0000 UTC m=+88.692926766" watchObservedRunningTime="2026-01-29 14:04:13.998921924 +0000 UTC m=+88.693656306"
Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.022875 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z8p72" podStartSLOduration=69.022853769 podStartE2EDuration="1m9.022853769s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:14.022549551 +0000 UTC m=+88.717283933" watchObservedRunningTime="2026-01-29 14:04:14.022853769 +0000 UTC m=+88.717588151"
Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.039242 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09481310-70c7-4287-b13f-7355280530fc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.039577 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/09481310-70c7-4287-b13f-7355280530fc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.039678 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/09481310-70c7-4287-b13f-7355280530fc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.039757 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/09481310-70c7-4287-b13f-7355280530fc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694"
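[Editor's note] The pod_startup_latency_tracker entries above report two figures per pod. A minimal Go sketch of the arithmetic, assuming simplified types (the real bookkeeping lives in kubelet's pod_startup_latency_tracker.go; function and parameter names here are illustrative): podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes image-pull time, which is why the two match whenever firstStartedPulling is the zero time, as in every entry above. The "m=+88.47..." suffixes are Go's monotonic-clock readings and do not change the subtraction.

package main

import (
	"fmt"
	"time"
)

// startupDurations mirrors the two figures in the log lines above.
func startupDurations(created, observedRunning, pullStart, pullEnd time.Time) (slo, e2e time.Duration) {
	e2e = observedRunning.Sub(created)
	slo = e2e
	if !pullStart.IsZero() {
		// SLO duration excludes time spent pulling images, when any pull happened.
		slo -= pullEnd.Sub(pullStart)
	}
	return slo, e2e
}

func main() {
	created := time.Date(2026, 1, 29, 14, 3, 5, 0, time.UTC)   // podCreationTimestamp
	running := created.Add(68*time.Second + 780887040)          // 14:04:13.78088704
	slo, e2e := startupDurations(created, running, time.Time{}, time.Time{})
	fmt.Println(slo, e2e) // 1m8.78088704s 1m8.78088704s, matching the first entry
}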
"MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/09481310-70c7-4287-b13f-7355280530fc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.039710 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/09481310-70c7-4287-b13f-7355280530fc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.039954 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09481310-70c7-4287-b13f-7355280530fc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.040042 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09481310-70c7-4287-b13f-7355280530fc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.040972 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09481310-70c7-4287-b13f-7355280530fc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.053245 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09481310-70c7-4287-b13f-7355280530fc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.065560 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09481310-70c7-4287-b13f-7355280530fc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b7694\" (UID: \"09481310-70c7-4287-b13f-7355280530fc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.083435 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.083411065 podStartE2EDuration="1m7.083411065s" podCreationTimestamp="2026-01-29 14:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:14.082423607 +0000 UTC m=+88.777157999" watchObservedRunningTime="2026-01-29 14:04:14.083411065 +0000 UTC m=+88.778145467" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.124198 4753 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-dns/node-resolver-s7czm" podStartSLOduration=69.124174568 podStartE2EDuration="1m9.124174568s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:14.109712845 +0000 UTC m=+88.804447227" watchObservedRunningTime="2026-01-29 14:04:14.124174568 +0000 UTC m=+88.818908950" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.139168 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.149971 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:14 crc kubenswrapper[4753]: E0129 14:04:14.150116 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:14 crc kubenswrapper[4753]: W0129 14:04:14.152910 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09481310_70c7_4287_b13f_7355280530fc.slice/crio-c40179a1a5b48d8326c1f93f5276c7b2cc8f9b6ff6095bdece2e53b987b941d2 WatchSource:0}: Error finding container c40179a1a5b48d8326c1f93f5276c7b2cc8f9b6ff6095bdece2e53b987b941d2: Status 404 returned error can't find the container with id c40179a1a5b48d8326c1f93f5276c7b2cc8f9b6ff6095bdece2e53b987b941d2 Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.196933 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 02:44:09.866527552 +0000 UTC Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.197010 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.206421 4753 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.368022 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-57lv7"] Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.688790 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" event={"ID":"09481310-70c7-4287-b13f-7355280530fc","Type":"ContainerStarted","Data":"02e0e10b3bf57daf4a273f13604680b40f19dd6b774f13ef6ef8e395824fbbde"} Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.688854 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" event={"ID":"09481310-70c7-4287-b13f-7355280530fc","Type":"ContainerStarted","Data":"c40179a1a5b48d8326c1f93f5276c7b2cc8f9b6ff6095bdece2e53b987b941d2"} Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.688886 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:14 crc kubenswrapper[4753]: E0129 14:04:14.689020 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:14 crc kubenswrapper[4753]: I0129 14:04:14.703093 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b7694" podStartSLOduration=69.70306814 podStartE2EDuration="1m9.70306814s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:14.701602549 +0000 UTC m=+89.396336951" watchObservedRunningTime="2026-01-29 14:04:14.70306814 +0000 UTC m=+89.397802542" Jan 29 14:04:15 crc kubenswrapper[4753]: I0129 14:04:15.148796 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:15 crc kubenswrapper[4753]: E0129 14:04:15.149286 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 29 14:04:15 crc kubenswrapper[4753]: I0129 14:04:15.148921 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:15 crc kubenswrapper[4753]: E0129 14:04:15.149356 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 29 14:04:15 crc kubenswrapper[4753]: I0129 14:04:15.148889 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:15 crc kubenswrapper[4753]: E0129 14:04:15.149410 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.148749 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:16 crc kubenswrapper[4753]: E0129 14:04:16.151045 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-57lv7" podUID="281741ae-7781-4682-8b1d-207c9a437581" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.594256 4753 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.594500 4753 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.656955 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.657691 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.665407 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.665709 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.666042 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.666243 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.666315 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.674494 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.674861 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55a1d93-4aed-460d-915c-452613c7718c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9klvl\" (UID: \"c55a1d93-4aed-460d-915c-452613c7718c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.674916 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55a1d93-4aed-460d-915c-452613c7718c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9klvl\" (UID: \"c55a1d93-4aed-460d-915c-452613c7718c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.674972 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tklm8\" (UniqueName: 
\"kubernetes.io/projected/c55a1d93-4aed-460d-915c-452613c7718c-kube-api-access-tklm8\") pod \"openshift-apiserver-operator-796bbdcf4f-9klvl\" (UID: \"c55a1d93-4aed-460d-915c-452613c7718c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.675347 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.676823 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589"] Jan 29 14:04:16 crc kubenswrapper[4753]: W0129 14:04:16.689879 4753 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 29 14:04:16 crc kubenswrapper[4753]: E0129 14:04:16.689966 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 29 14:04:16 crc kubenswrapper[4753]: W0129 14:04:16.690268 4753 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 29 14:04:16 crc kubenswrapper[4753]: E0129 14:04:16.690329 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 29 14:04:16 crc kubenswrapper[4753]: W0129 14:04:16.690444 4753 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 29 14:04:16 crc kubenswrapper[4753]: W0129 14:04:16.690454 4753 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 29 14:04:16 crc kubenswrapper[4753]: E0129 14:04:16.690478 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps 
\"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 29 14:04:16 crc kubenswrapper[4753]: E0129 14:04:16.690501 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 29 14:04:16 crc kubenswrapper[4753]: W0129 14:04:16.690565 4753 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 29 14:04:16 crc kubenswrapper[4753]: W0129 14:04:16.690594 4753 reflector.go:561] object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2": failed to list *v1.Secret: secrets "route-controller-manager-sa-dockercfg-h2zr2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 29 14:04:16 crc kubenswrapper[4753]: E0129 14:04:16.690598 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 29 14:04:16 crc kubenswrapper[4753]: E0129 14:04:16.690628 4753 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-h2zr2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"route-controller-manager-sa-dockercfg-h2zr2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.690968 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.707399 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.707475 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.707583 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.707673 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.707763 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.708007 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.708196 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.708358 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.708518 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.711889 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jvpqh"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.743082 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5xcbl"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.743378 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gn6zd"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.743613 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gd25h"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.744087 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.744215 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.746422 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.746734 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.747172 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.749846 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-bxx4k"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.750418 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.750887 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jbfcj"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.755668 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.755943 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.760905 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.761216 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.761384 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hf9wc"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.761446 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.761599 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.761678 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.761748 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.761818 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.762311 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.762388 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.762460 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.762534 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.762582 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-jbfcj" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.763249 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.763277 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.763764 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.763983 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.764023 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.764266 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.764274 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.764426 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.764437 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.764599 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.766678 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.767091 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.767223 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.767323 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.767429 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.767869 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.767978 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.768119 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.768252 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 29 14:04:16 crc kubenswrapper[4753]: 
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.768510 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.768600 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.768681 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.769351 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.769433 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.769533 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.769850 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.769924 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.771863 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.772174 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.772610 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.775364 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.775432 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk"]
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.775861 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.775982 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.776275 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777000 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6h58\" (UniqueName: \"kubernetes.io/projected/842d8f24-8875-4968-a486-99a53a278850-kube-api-access-v6h58\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777043 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777065 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-serving-cert\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777082 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/842d8f24-8875-4968-a486-99a53a278850-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777112 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777133 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777175 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/842d8f24-8875-4968-a486-99a53a278850-audit-policies\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777193 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-serving-cert\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777209 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzxcm\" (UniqueName: \"kubernetes.io/projected/78dab1db-992a-4ae3-97a1-d70613ac41fe-kube-api-access-tzxcm\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777224 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777267 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777285 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9rtg\" (UniqueName: \"kubernetes.io/projected/4a83492d-36e3-4400-a969-71934ecfc9f7-kube-api-access-z9rtg\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777300 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/78dab1db-992a-4ae3-97a1-d70613ac41fe-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777333 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-config\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777354 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78dab1db-992a-4ae3-97a1-d70613ac41fe-config\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777373 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/842d8f24-8875-4968-a486-99a53a278850-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777405 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/842d8f24-8875-4968-a486-99a53a278850-audit-dir\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589"
Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777422 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-config\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl"
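[Editor's note] The reconciler_common.go:245 (VerifyControllerAttachedVolume) and reconciler_common.go:218 (MountVolume) lines interleaved through this section are successive passes of the same desired-state reconcile loop, with operation_generator.go:637 reporting each SetUp that succeeds. A deliberately simplified model of that loop; the state names and functions below are illustrative assumptions, not kubelet's actual types (those live in pkg/kubelet/volumemanager):

package main

import "fmt"

type state int

const (
	unverified state = iota // not yet checked against the API server
	verified                // VerifyControllerAttachedVolume succeeded (reconciler_common.go:245)
	mounted                 // MountVolume.SetUp succeeded (operation_generator.go:637)
)

// reconcile advances every desired volume by one step per pass, mirroring how
// the log shows verify lines for a pod first and mount lines on a later pass.
func reconcile(desired []string, world map[string]state) {
	for _, vol := range desired {
		switch world[vol] {
		case unverified:
			fmt.Printf("operationExecutor.VerifyControllerAttachedVolume started for volume %q\n", vol)
			world[vol] = verified
		case verified:
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
			world[vol] = mounted
		}
	}
}

func main() {
	world := map[string]state{}
	desired := []string{"serving-cert", "config", "kube-api-access-tklm8"}
	reconcile(desired, world) // first pass: verify
	reconcile(desired, world) // second pass: mount
}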
\"config\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-config\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777436 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd6a836a-9268-4a6a-ab7f-00605823dc9f-metrics-tls\") pod \"dns-operator-744455d44c-hf9wc\" (UID: \"cd6a836a-9268-4a6a-ab7f-00605823dc9f\") " pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777456 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55a1d93-4aed-460d-915c-452613c7718c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9klvl\" (UID: \"c55a1d93-4aed-460d-915c-452613c7718c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777491 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/776ea39d-e7fb-497f-bfaf-41b385b76754-serving-cert\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777511 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-client-ca\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777528 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpjhb\" (UniqueName: \"kubernetes.io/projected/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-kube-api-access-xpjhb\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777565 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/842d8f24-8875-4968-a486-99a53a278850-etcd-client\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777580 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-serving-cert\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777598 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55a1d93-4aed-460d-915c-452613c7718c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9klvl\" 
(UID: \"c55a1d93-4aed-460d-915c-452613c7718c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777614 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777645 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-node-pullsecrets\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777676 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/842d8f24-8875-4968-a486-99a53a278850-serving-cert\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777697 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-policies\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777733 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-oauth-config\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777752 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777767 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-client-ca\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777801 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcdzv\" (UniqueName: \"kubernetes.io/projected/f5125f20-92c4-4700-8c7f-f8c7fc7b48b9-kube-api-access-xcdzv\") pod \"downloads-7954f5f757-jbfcj\" (UID: \"f5125f20-92c4-4700-8c7f-f8c7fc7b48b9\") " 
pod="openshift-console/downloads-7954f5f757-jbfcj" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777818 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-dir\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777834 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-oauth-serving-cert\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777850 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-image-import-ca\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777883 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-encryption-config\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777913 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777929 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-trusted-ca-bundle\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.777965 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778004 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-audit-dir\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778035 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-service-ca\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778052 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-etcd-serving-ca\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778068 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz2vq\" (UniqueName: \"kubernetes.io/projected/810f6f52-858d-46f6-b2d2-71f9c3135263-kube-api-access-dz2vq\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778137 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-console-config\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778180 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpfcs\" (UniqueName: \"kubernetes.io/projected/776ea39d-e7fb-497f-bfaf-41b385b76754-kube-api-access-gpfcs\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778269 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-audit\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778293 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778311 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg5rv\" (UniqueName: \"kubernetes.io/projected/cd6a836a-9268-4a6a-ab7f-00605823dc9f-kube-api-access-rg5rv\") pod \"dns-operator-744455d44c-hf9wc\" (UID: \"cd6a836a-9268-4a6a-ab7f-00605823dc9f\") " pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778344 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-config\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778361 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778381 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-927kk\" (UniqueName: \"kubernetes.io/projected/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-kube-api-access-927kk\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778398 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-etcd-client\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778431 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778449 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778471 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tklm8\" (UniqueName: \"kubernetes.io/projected/c55a1d93-4aed-460d-915c-452613c7718c-kube-api-access-tklm8\") pod \"openshift-apiserver-operator-796bbdcf4f-9klvl\" (UID: \"c55a1d93-4aed-460d-915c-452613c7718c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778506 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/842d8f24-8875-4968-a486-99a53a278850-encryption-config\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.778521 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78dab1db-992a-4ae3-97a1-d70613ac41fe-images\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.780867 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c55a1d93-4aed-460d-915c-452613c7718c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9klvl\" (UID: \"c55a1d93-4aed-460d-915c-452613c7718c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.796683 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-944nm"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.797361 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.797894 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.798386 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.798884 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.799046 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c55a1d93-4aed-460d-915c-452613c7718c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9klvl\" (UID: \"c55a1d93-4aed-460d-915c-452613c7718c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.801619 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.804407 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.812058 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rls7b"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.814498 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.819720 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.825259 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.825468 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.826032 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.827056 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rf798"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.827668 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.828022 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.828570 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.828755 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.829010 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.829311 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.832592 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.834740 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.834917 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.835431 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.835682 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.836399 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.838799 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.841737 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lvbft"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842142 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842277 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842397 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842514 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842525 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842725 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842749 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842833 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842615 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.843026 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842648 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.843196 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842682 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842992 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.842959 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.844231 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.844599 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.844616 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.844649 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.845291 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.845487 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.846767 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.846556 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.847337 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.847757 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.848481 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.855311 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.856342 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tklm8\" (UniqueName: \"kubernetes.io/projected/c55a1d93-4aed-460d-915c-452613c7718c-kube-api-access-tklm8\") pod \"openshift-apiserver-operator-796bbdcf4f-9klvl\" (UID: \"c55a1d93-4aed-460d-915c-452613c7718c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.857450 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.858418 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.859233 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.865416 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.866558 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.867092 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.869348 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.870402 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.871267 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-4v9zm"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.872068 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.871682 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.872844 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mxrlb"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.873196 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.873123 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.877119 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.877748 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.881472 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.885786 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.885881 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-serving-cert\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.885940 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/842d8f24-8875-4968-a486-99a53a278850-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.885975 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886006 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-serving-cert\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886028 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzxcm\" (UniqueName: \"kubernetes.io/projected/78dab1db-992a-4ae3-97a1-d70613ac41fe-kube-api-access-tzxcm\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886051 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886085 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/842d8f24-8875-4968-a486-99a53a278850-audit-policies\") pod \"apiserver-7bbb656c7d-bp589\" (UID: 
\"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886112 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886132 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9rtg\" (UniqueName: \"kubernetes.io/projected/4a83492d-36e3-4400-a969-71934ecfc9f7-kube-api-access-z9rtg\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886220 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886319 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/78dab1db-992a-4ae3-97a1-d70613ac41fe-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886342 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-config\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886383 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/842d8f24-8875-4968-a486-99a53a278850-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886431 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/842d8f24-8875-4968-a486-99a53a278850-audit-dir\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886461 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78dab1db-992a-4ae3-97a1-d70613ac41fe-config\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886485 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-config\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886511 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd6a836a-9268-4a6a-ab7f-00605823dc9f-metrics-tls\") pod \"dns-operator-744455d44c-hf9wc\" (UID: \"cd6a836a-9268-4a6a-ab7f-00605823dc9f\") " pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886537 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/776ea39d-e7fb-497f-bfaf-41b385b76754-serving-cert\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886566 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-client-ca\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886583 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpjhb\" (UniqueName: \"kubernetes.io/projected/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-kube-api-access-xpjhb\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886611 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/842d8f24-8875-4968-a486-99a53a278850-etcd-client\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886651 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-serving-cert\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886678 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886703 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-node-pullsecrets\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 
crc kubenswrapper[4753]: I0129 14:04:16.886727 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/842d8f24-8875-4968-a486-99a53a278850-serving-cert\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.886781 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.887394 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-config\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.887465 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.888253 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-policies\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.894105 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.894926 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/842d8f24-8875-4968-a486-99a53a278850-audit-policies\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.894981 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/842d8f24-8875-4968-a486-99a53a278850-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.895026 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-node-pullsecrets\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.896089 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.897101 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.898419 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-serving-cert\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.898675 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/842d8f24-8875-4968-a486-99a53a278850-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.899559 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.899685 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.899843 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/842d8f24-8875-4968-a486-99a53a278850-serving-cert\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.899949 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-serving-cert\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900059 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/842d8f24-8875-4968-a486-99a53a278850-audit-dir\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.888305 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-oauth-config\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900101 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900124 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900172 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-client-ca\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900199 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcdzv\" (UniqueName: \"kubernetes.io/projected/f5125f20-92c4-4700-8c7f-f8c7fc7b48b9-kube-api-access-xcdzv\") pod \"downloads-7954f5f757-jbfcj\" (UID: \"f5125f20-92c4-4700-8c7f-f8c7fc7b48b9\") " pod="openshift-console/downloads-7954f5f757-jbfcj" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900226 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-image-import-ca\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900242 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-encryption-config\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-dir\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900281 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-oauth-serving-cert\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900301 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900319 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-trusted-ca-bundle\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900347 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900364 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-audit-dir\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900383 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-etcd-serving-ca\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900407 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-service-ca\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900437 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz2vq\" (UniqueName: \"kubernetes.io/projected/810f6f52-858d-46f6-b2d2-71f9c3135263-kube-api-access-dz2vq\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900464 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-console-config\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900467 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-policies\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900488 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpfcs\" (UniqueName: \"kubernetes.io/projected/776ea39d-e7fb-497f-bfaf-41b385b76754-kube-api-access-gpfcs\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900536 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-audit\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900564 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-config\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900592 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900616 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg5rv\" (UniqueName: \"kubernetes.io/projected/cd6a836a-9268-4a6a-ab7f-00605823dc9f-kube-api-access-rg5rv\") pod \"dns-operator-744455d44c-hf9wc\" (UID: \"cd6a836a-9268-4a6a-ab7f-00605823dc9f\") " pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900644 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900675 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-927kk\" (UniqueName: \"kubernetes.io/projected/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-kube-api-access-927kk\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900704 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-etcd-client\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900731 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900762 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900790 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/842d8f24-8875-4968-a486-99a53a278850-encryption-config\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.900823 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78dab1db-992a-4ae3-97a1-d70613ac41fe-images\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.901076 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6h58\" (UniqueName: \"kubernetes.io/projected/842d8f24-8875-4968-a486-99a53a278850-kube-api-access-v6h58\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.901340 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-image-import-ca\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.901690 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd6a836a-9268-4a6a-ab7f-00605823dc9f-metrics-tls\") pod \"dns-operator-744455d44c-hf9wc\" (UID: \"cd6a836a-9268-4a6a-ab7f-00605823dc9f\") " pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.901914 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-client-ca\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.902248 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-dir\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.902585 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 
14:04:16.903276 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.903300 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-trusted-ca-bundle\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.903336 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-oauth-serving-cert\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.903358 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-audit-dir\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.904386 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-audit\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.904455 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-service-ca\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.904529 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.904610 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-console-config\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.904811 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc 
kubenswrapper[4753]: I0129 14:04:16.905330 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-etcd-serving-ca\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.905425 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.907187 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78dab1db-992a-4ae3-97a1-d70613ac41fe-config\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.908944 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/842d8f24-8875-4968-a486-99a53a278850-etcd-client\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.909510 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.909591 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-serving-cert\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.910298 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-config\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.910630 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-encryption-config\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.910732 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: 
\"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.910722 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-oauth-config\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.911080 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78dab1db-992a-4ae3-97a1-d70613ac41fe-images\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.911400 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-etcd-client\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.912324 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/842d8f24-8875-4968-a486-99a53a278850-encryption-config\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.912353 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.912695 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkq94"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.913576 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.913632 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/78dab1db-992a-4ae3-97a1-d70613ac41fe-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.914098 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.914500 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.915492 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.916056 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.916606 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.916959 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.917884 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.918962 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.919125 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.919492 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.919949 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.920188 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.920248 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.920757 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.921082 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mtr5k"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.921504 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.922087 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.922555 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.923276 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qgfrx"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.923631 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.924298 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.925603 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.926594 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jvpqh"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.928496 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rls7b"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.929419 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-944nm"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.930440 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5xcbl"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.932114 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gn6zd"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.932690 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bxx4k"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.933630 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.934623 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.935745 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hf9wc"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.936607 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.937555 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5qnph"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.938286 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5qnph" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.938712 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.939700 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.939727 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.940699 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mxrlb"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.941684 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.942598 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.943572 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.944583 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.946167 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.947321 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.948306 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkq94"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.949252 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lvbft"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.950241 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rf798"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.952715 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gd25h"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.954879 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jbfcj"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.956270 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.959778 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.960673 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 29 14:04:16 crc kubenswrapper[4753]: 
I0129 14:04:16.961505 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.964588 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qgfrx"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.966988 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.969648 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.973069 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mtr5k"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.974775 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8dzsr"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.975611 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8dzsr" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.976333 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.978976 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.979203 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.981721 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5qnph"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.983841 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.985763 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-58ksd"] Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.986832 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:16 crc kubenswrapper[4753]: I0129 14:04:16.987711 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-58ksd"] Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.000382 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.014327 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.020192 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.040045 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.079752 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.099989 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.120566 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.140252 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.148519 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.148997 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.149320 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.161128 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.180253 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.200873 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.220465 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.239602 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.259665 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.280716 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.301174 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl"] Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.302662 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 29 14:04:17 crc kubenswrapper[4753]: W0129 14:04:17.310290 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc55a1d93_4aed_460d_915c_452613c7718c.slice/crio-47c82f853d4d375f58526fc0d3955dcf1410646738b76c4fea014ac4ae388b9a WatchSource:0}: Error finding container 47c82f853d4d375f58526fc0d3955dcf1410646738b76c4fea014ac4ae388b9a: Status 404 returned error can't find the container with id 47c82f853d4d375f58526fc0d3955dcf1410646738b76c4fea014ac4ae388b9a Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.319996 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.340289 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.360726 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.390366 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.400370 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.420591 4753 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.439996 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.460092 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.480117 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.499728 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.520028 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.540222 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.560647 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.580699 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.599977 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.621036 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.640856 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.681911 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.701308 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.713882 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" event={"ID":"c55a1d93-4aed-460d-915c-452613c7718c","Type":"ContainerStarted","Data":"cc772ab863dd787174cb38299f0e3a58b32fc2e2f491ca9a833e41c3e93902f7"} Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.713939 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" event={"ID":"c55a1d93-4aed-460d-915c-452613c7718c","Type":"ContainerStarted","Data":"47c82f853d4d375f58526fc0d3955dcf1410646738b76c4fea014ac4ae388b9a"} Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.720791 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.740859 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 
14:04:17.760713 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.781340 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.815196 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.849255 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzxcm\" (UniqueName: \"kubernetes.io/projected/78dab1db-992a-4ae3-97a1-d70613ac41fe-kube-api-access-tzxcm\") pod \"machine-api-operator-5694c8668f-jvpqh\" (UID: \"78dab1db-992a-4ae3-97a1-d70613ac41fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.859369 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9rtg\" (UniqueName: \"kubernetes.io/projected/4a83492d-36e3-4400-a969-71934ecfc9f7-kube-api-access-z9rtg\") pod \"oauth-openshift-558db77b4-gn6zd\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") " pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.878413 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpjhb\" (UniqueName: \"kubernetes.io/projected/8a7be31e-2bc5-4b5c-8d26-a220f35b87d4-kube-api-access-xpjhb\") pod \"apiserver-76f77b778f-gd25h\" (UID: \"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4\") " pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.898044 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6h58\" (UniqueName: \"kubernetes.io/projected/842d8f24-8875-4968-a486-99a53a278850-kube-api-access-v6h58\") pod \"apiserver-7bbb656c7d-bp589\" (UID: \"842d8f24-8875-4968-a486-99a53a278850\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:17 crc kubenswrapper[4753]: E0129 14:04:17.901904 4753 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 29 14:04:17 crc kubenswrapper[4753]: E0129 14:04:17.902017 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/776ea39d-e7fb-497f-bfaf-41b385b76754-serving-cert podName:776ea39d-e7fb-497f-bfaf-41b385b76754 nodeName:}" failed. No retries permitted until 2026-01-29 14:04:18.401993051 +0000 UTC m=+93.096727443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/776ea39d-e7fb-497f-bfaf-41b385b76754-serving-cert") pod "route-controller-manager-6576b87f9c-lhpqt" (UID: "776ea39d-e7fb-497f-bfaf-41b385b76754") : failed to sync secret cache: timed out waiting for the condition Jan 29 14:04:17 crc kubenswrapper[4753]: E0129 14:04:17.902253 4753 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Jan 29 14:04:17 crc kubenswrapper[4753]: E0129 14:04:17.902325 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-client-ca podName:776ea39d-e7fb-497f-bfaf-41b385b76754 nodeName:}" failed. No retries permitted until 2026-01-29 14:04:18.402306689 +0000 UTC m=+93.097041071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-client-ca") pod "route-controller-manager-6576b87f9c-lhpqt" (UID: "776ea39d-e7fb-497f-bfaf-41b385b76754") : failed to sync configmap cache: timed out waiting for the condition Jan 29 14:04:17 crc kubenswrapper[4753]: E0129 14:04:17.902396 4753 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Jan 29 14:04:17 crc kubenswrapper[4753]: E0129 14:04:17.902465 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-config podName:776ea39d-e7fb-497f-bfaf-41b385b76754 nodeName:}" failed. No retries permitted until 2026-01-29 14:04:18.402451423 +0000 UTC m=+93.097186025 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-config") pod "route-controller-manager-6576b87f9c-lhpqt" (UID: "776ea39d-e7fb-497f-bfaf-41b385b76754") : failed to sync configmap cache: timed out waiting for the condition Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.918421 4753 request.go:700] Waited for 1.014993023s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.919064 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcdzv\" (UniqueName: \"kubernetes.io/projected/f5125f20-92c4-4700-8c7f-f8c7fc7b48b9-kube-api-access-xcdzv\") pod \"downloads-7954f5f757-jbfcj\" (UID: \"f5125f20-92c4-4700-8c7f-f8c7fc7b48b9\") " pod="openshift-console/downloads-7954f5f757-jbfcj" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.942433 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz2vq\" (UniqueName: \"kubernetes.io/projected/810f6f52-858d-46f6-b2d2-71f9c3135263-kube-api-access-dz2vq\") pod \"console-f9d7485db-bxx4k\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.953192 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-927kk\" (UniqueName: \"kubernetes.io/projected/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-kube-api-access-927kk\") pod \"controller-manager-879f6c89f-5xcbl\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:17 crc kubenswrapper[4753]: I0129 14:04:17.979059 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg5rv\" (UniqueName: \"kubernetes.io/projected/cd6a836a-9268-4a6a-ab7f-00605823dc9f-kube-api-access-rg5rv\") pod \"dns-operator-744455d44c-hf9wc\" (UID: \"cd6a836a-9268-4a6a-ab7f-00605823dc9f\") " pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.000566 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.032687 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.035123 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.039459 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.047864 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.060601 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.064401 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.072115 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.080738 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.084549 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.100776 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.120248 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.128102 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.135444 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jbfcj" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.144429 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.147490 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.153059 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.160430 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.189993 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.201470 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.222124 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.241663 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.262359 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.281833 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.304065 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.323867 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.343870 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.344793 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5xcbl"] Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.360174 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.370530 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589"] Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.380586 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.400396 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.421263 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.425838 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/776ea39d-e7fb-497f-bfaf-41b385b76754-serving-cert\") pod \"route-controller-manager-6576b87f9c-lhpqt\" 
(UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.425875 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-client-ca\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.425934 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-config\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.442192 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.460328 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.479588 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.504207 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.520037 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.541812 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.568936 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.580803 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.599605 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.617148 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gd25h"] Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.620564 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jvpqh"] Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.620668 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: W0129 14:04:18.625823 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a7be31e_2bc5_4b5c_8d26_a220f35b87d4.slice/crio-59016a807e712e54b8a69d2046a1752f40c6480974b020cdd7ead6ded61989f8 WatchSource:0}: Error finding container 
59016a807e712e54b8a69d2046a1752f40c6480974b020cdd7ead6ded61989f8: Status 404 returned error can't find the container with id 59016a807e712e54b8a69d2046a1752f40c6480974b020cdd7ead6ded61989f8 Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.643691 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.663034 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.668955 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hf9wc"] Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.672282 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jbfcj"] Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.672340 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gn6zd"] Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.680572 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.692787 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bxx4k"] Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.704550 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.723539 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.730376 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" event={"ID":"cd6a836a-9268-4a6a-ab7f-00605823dc9f","Type":"ContainerStarted","Data":"af4becf7011d2db007ff48e821f324965e97604df3a6a474f126d75a84e070b2"} Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.731867 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" event={"ID":"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4","Type":"ContainerStarted","Data":"59016a807e712e54b8a69d2046a1752f40c6480974b020cdd7ead6ded61989f8"} Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.733861 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" event={"ID":"4a83492d-36e3-4400-a969-71934ecfc9f7","Type":"ContainerStarted","Data":"6b0d82125df57a7442de78d5a13fc3a210b3ef4dd79f2a9c1088e432cec5775b"} Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.736005 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" event={"ID":"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23","Type":"ContainerStarted","Data":"b22c5c3be2362c805abdf785d9d3ed0a8e49b9a524ae6159b3e13b2b1a8d945d"} Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.736031 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" event={"ID":"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23","Type":"ContainerStarted","Data":"1993f9406507092e575b1f0f4df62719cfa72f813b2f425799ab8a94bbd513a4"} Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.736223 4753 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.738188 4753 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5xcbl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.738236 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" podUID="18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.740012 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.740815 4753 generic.go:334] "Generic (PLEG): container finished" podID="842d8f24-8875-4968-a486-99a53a278850" containerID="82263e204583159252ce5c59740be25b2fb70c8319574f9d7f4b495c98f01b38" exitCode=0 Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.740926 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" event={"ID":"842d8f24-8875-4968-a486-99a53a278850","Type":"ContainerDied","Data":"82263e204583159252ce5c59740be25b2fb70c8319574f9d7f4b495c98f01b38"} Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.740955 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" event={"ID":"842d8f24-8875-4968-a486-99a53a278850","Type":"ContainerStarted","Data":"c02aa714ac91ba9962b6512a38c5bca00acf5621de73272c7dbedf98906d8231"} Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.742949 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" event={"ID":"78dab1db-992a-4ae3-97a1-d70613ac41fe","Type":"ContainerStarted","Data":"8b2bdcac6e220751dfafa684e2625c1bc1687408d7e610d615dcfa2931cb4289"} Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.760533 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.780124 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.800688 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.821611 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.840515 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.864130 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.881448 4753 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.920744 4753 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.921015 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.938323 4753 request.go:700] Waited for 1.951232686s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.941247 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: I0129 14:04:18.963185 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 29 14:04:18 crc kubenswrapper[4753]: E0129 14:04:18.997441 4753 projected.go:288] Couldn't get configMap openshift-route-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.003890 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.022624 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.041138 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.059825 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.082019 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.088370 4753 projected.go:194] Error preparing data for projected volume kube-api-access-gpfcs for pod openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt: failed to sync configmap cache: timed out waiting for the condition Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.088453 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/776ea39d-e7fb-497f-bfaf-41b385b76754-kube-api-access-gpfcs podName:776ea39d-e7fb-497f-bfaf-41b385b76754 nodeName:}" failed. No retries permitted until 2026-01-29 14:04:19.588427152 +0000 UTC m=+94.283161534 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gpfcs" (UniqueName: "kubernetes.io/projected/776ea39d-e7fb-497f-bfaf-41b385b76754-kube-api-access-gpfcs") pod "route-controller-manager-6576b87f9c-lhpqt" (UID: "776ea39d-e7fb-497f-bfaf-41b385b76754") : failed to sync configmap cache: timed out waiting for the condition Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.120807 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.133453 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ca5f336c-cf5c-4e2a-b7fb-64acbd039052-available-featuregates\") pod \"openshift-config-operator-7777fb866f-t9bsg\" (UID: \"ca5f336c-cf5c-4e2a-b7fb-64acbd039052\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.133670 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rzh4\" (UniqueName: \"kubernetes.io/projected/7b31af78-d6bd-49eb-8d02-d22af087771d-kube-api-access-5rzh4\") pod \"dns-default-mxrlb\" (UID: \"7b31af78-d6bd-49eb-8d02-d22af087771d\") " pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.133791 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6a18fef7-00cd-4027-bec1-91ded07e3bfb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.133895 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5f336c-cf5c-4e2a-b7fb-64acbd039052-serving-cert\") pod \"openshift-config-operator-7777fb866f-t9bsg\" (UID: \"ca5f336c-cf5c-4e2a-b7fb-64acbd039052\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134120 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxn5v\" (UniqueName: \"kubernetes.io/projected/ca5f336c-cf5c-4e2a-b7fb-64acbd039052-kube-api-access-qxn5v\") pod \"openshift-config-operator-7777fb866f-t9bsg\" (UID: \"ca5f336c-cf5c-4e2a-b7fb-64acbd039052\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134200 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db55f8c6-430b-46b6-b862-99a7f86d11da-trusted-ca\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134237 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-certificates\") pod \"image-registry-697d97f7c8-rf798\" (UID: 
\"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134576 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1db476d8-783c-4520-9b07-dfd4525a064f-metrics-certs\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134601 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e82f4e07-1554-4230-867f-427e7469c789-config\") pod \"kube-apiserver-operator-766d6c64bb-9gtxn\" (UID: \"e82f4e07-1554-4230-867f-427e7469c789\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134683 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b31af78-d6bd-49eb-8d02-d22af087771d-metrics-tls\") pod \"dns-default-mxrlb\" (UID: \"7b31af78-d6bd-49eb-8d02-d22af087771d\") " pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134767 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bslb\" (UniqueName: \"kubernetes.io/projected/6d8327fe-0e6c-46eb-b79e-6440451d8393-kube-api-access-8bslb\") pod \"multus-admission-controller-857f4d67dd-lvbft\" (UID: \"6d8327fe-0e6c-46eb-b79e-6440451d8393\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134815 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1db476d8-783c-4520-9b07-dfd4525a064f-service-ca-bundle\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134836 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ed097d8a-81a4-4c18-960e-20041c9df31a-srv-cert\") pod \"olm-operator-6b444d44fb-h24bz\" (UID: \"ed097d8a-81a4-4c18-960e-20041c9df31a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.134880 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e21223f9-863b-45dd-b641-afa73286591f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.135224 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e82f4e07-1554-4230-867f-427e7469c789-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9gtxn\" (UID: \"e82f4e07-1554-4230-867f-427e7469c789\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.135395 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e21223f9-863b-45dd-b641-afa73286591f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.135562 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33156757-7442-4962-ab8e-fe8e8a735fcc-service-ca-bundle\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.135635 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6cb6d4b-c2d5-4efd-8269-93827412a96d-config\") pod \"service-ca-operator-777779d784-fwm9n\" (UID: \"f6cb6d4b-c2d5-4efd-8269-93827412a96d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.135740 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzxdq\" (UniqueName: \"kubernetes.io/projected/1db476d8-783c-4520-9b07-dfd4525a064f-kube-api-access-lzxdq\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.135890 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1db476d8-783c-4520-9b07-dfd4525a064f-default-certificate\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.135979 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1db476d8-783c-4520-9b07-dfd4525a064f-stats-auth\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.136047 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdxn\" (UniqueName: \"kubernetes.io/projected/dcbdaa74-121b-4b7c-bd7b-df2aa01a341c-kube-api-access-qrdxn\") pod \"cluster-samples-operator-665b6dd947-8fv88\" (UID: \"dcbdaa74-121b-4b7c-bd7b-df2aa01a341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.136127 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db55f8c6-430b-46b6-b862-99a7f86d11da-serving-cert\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " 
pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.136244 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2be7ad19-75d6-4c8e-bf95-5e62182fd5ac-proxy-tls\") pod \"machine-config-controller-84d6567774-6cp9x\" (UID: \"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.136317 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905043f7-d346-450b-b46b-4ffcd29313af-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9dl5\" (UID: \"905043f7-d346-450b-b46b-4ffcd29313af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.136415 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905043f7-d346-450b-b46b-4ffcd29313af-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9dl5\" (UID: \"905043f7-d346-450b-b46b-4ffcd29313af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.136485 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9cdn\" (UniqueName: \"kubernetes.io/projected/f6cb6d4b-c2d5-4efd-8269-93827412a96d-kube-api-access-c9cdn\") pod \"service-ca-operator-777779d784-fwm9n\" (UID: \"f6cb6d4b-c2d5-4efd-8269-93827412a96d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.136571 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ed95e9c-31ac-4716-8c19-a76da12afe85-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f297q\" (UID: \"6ed95e9c-31ac-4716-8c19-a76da12afe85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.136650 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ed097d8a-81a4-4c18-960e-20041c9df31a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h24bz\" (UID: \"ed097d8a-81a4-4c18-960e-20041c9df31a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.136735 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8vfn\" (UniqueName: \"kubernetes.io/projected/905043f7-d346-450b-b46b-4ffcd29313af-kube-api-access-w8vfn\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9dl5\" (UID: \"905043f7-d346-450b-b46b-4ffcd29313af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.138235 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3423cc35-1ccb-487e-8d2e-fc72a2f03d9f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnbcl\" (UID: \"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.138334 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-bound-sa-token\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.138382 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgvn6\" (UniqueName: \"kubernetes.io/projected/042c5d25-f716-4d9d-a663-28f18b9dfbc1-kube-api-access-pgvn6\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.138410 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmdkp\" (UniqueName: \"kubernetes.io/projected/ed097d8a-81a4-4c18-960e-20041c9df31a-kube-api-access-qmdkp\") pod \"olm-operator-6b444d44fb-h24bz\" (UID: \"ed097d8a-81a4-4c18-960e-20041c9df31a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.138475 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2be7ad19-75d6-4c8e-bf95-5e62182fd5ac-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6cp9x\" (UID: \"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.138616 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6a18fef7-00cd-4027-bec1-91ded07e3bfb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.138720 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e21223f9-863b-45dd-b641-afa73286591f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.138847 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33156757-7442-4962-ab8e-fe8e8a735fcc-serving-cert\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: 
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.138932 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e82f4e07-1554-4230-867f-427e7469c789-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9gtxn\" (UID: \"e82f4e07-1554-4230-867f-427e7469c789\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139003 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt49s\" (UniqueName: \"kubernetes.io/projected/33156757-7442-4962-ab8e-fe8e8a735fcc-kube-api-access-vt49s\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139065 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js8l9\" (UniqueName: \"kubernetes.io/projected/e21223f9-863b-45dd-b641-afa73286591f-kube-api-access-js8l9\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139126 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cmg4\" (UniqueName: \"kubernetes.io/projected/6ed95e9c-31ac-4716-8c19-a76da12afe85-kube-api-access-8cmg4\") pod \"package-server-manager-789f6589d5-f297q\" (UID: \"6ed95e9c-31ac-4716-8c19-a76da12afe85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139270 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139372 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33156757-7442-4962-ab8e-fe8e8a735fcc-config\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139410 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfcv2\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-kube-api-access-jfcv2\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139559 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6d8327fe-0e6c-46eb-b79e-6440451d8393-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lvbft\" (UID: \"6d8327fe-0e6c-46eb-b79e-6440451d8393\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139585 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3423cc35-1ccb-487e-8d2e-fc72a2f03d9f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnbcl\" (UID: \"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139660 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b31af78-d6bd-49eb-8d02-d22af087771d-config-volume\") pod \"dns-default-mxrlb\" (UID: \"7b31af78-d6bd-49eb-8d02-d22af087771d\") " pod="openshift-dns/dns-default-mxrlb"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139721 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-842ww\" (UniqueName: \"kubernetes.io/projected/2be7ad19-75d6-4c8e-bf95-5e62182fd5ac-kube-api-access-842ww\") pod \"machine-config-controller-84d6567774-6cp9x\" (UID: \"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139743 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-tls\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.139759 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db55f8c6-430b-46b6-b862-99a7f86d11da-config\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.143907 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.146168 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:19.646134528 +0000 UTC m=+94.340868910 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.146324 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042c5d25-f716-4d9d-a663-28f18b9dfbc1-config\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.146731 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33156757-7442-4962-ab8e-fe8e8a735fcc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.146860 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-trusted-ca\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.146916 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcbdaa74-121b-4b7c-bd7b-df2aa01a341c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8fv88\" (UID: \"dcbdaa74-121b-4b7c-bd7b-df2aa01a341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.147026 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6cb6d4b-c2d5-4efd-8269-93827412a96d-serving-cert\") pod \"service-ca-operator-777779d784-fwm9n\" (UID: \"f6cb6d4b-c2d5-4efd-8269-93827412a96d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.147105 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/042c5d25-f716-4d9d-a663-28f18b9dfbc1-auth-proxy-config\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.147180 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx7lx\" (UniqueName: \"kubernetes.io/projected/db55f8c6-430b-46b6-b862-99a7f86d11da-kube-api-access-cx7lx\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.148057 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpz7t\" (UniqueName: \"kubernetes.io/projected/3423cc35-1ccb-487e-8d2e-fc72a2f03d9f-kube-api-access-cpz7t\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnbcl\" (UID: \"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.148125 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/042c5d25-f716-4d9d-a663-28f18b9dfbc1-machine-approver-tls\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.162101 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.167719 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-client-ca\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.180669 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.193693 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/776ea39d-e7fb-497f-bfaf-41b385b76754-serving-cert\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.200190 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.210220 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-config\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.221032 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.240081 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249260 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.249457 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:19.74942667 +0000 UTC m=+94.444161052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249554 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6cb6d4b-c2d5-4efd-8269-93827412a96d-serving-cert\") pod \"service-ca-operator-777779d784-fwm9n\" (UID: \"f6cb6d4b-c2d5-4efd-8269-93827412a96d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249602 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8lc6\" (UniqueName: \"kubernetes.io/projected/de6aa533-9dd6-4579-aee2-38c2aebc7e31-kube-api-access-f8lc6\") pod \"ingress-canary-5qnph\" (UID: \"de6aa533-9dd6-4579-aee2-38c2aebc7e31\") " pod="openshift-ingress-canary/ingress-canary-5qnph"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249627 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83519b1e-5a50-4774-a86c-7117668abf6e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-txd8j\" (UID: \"83519b1e-5a50-4774-a86c-7117668abf6e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249652 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/042c5d25-f716-4d9d-a663-28f18b9dfbc1-auth-proxy-config\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249675 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx7lx\" (UniqueName: \"kubernetes.io/projected/db55f8c6-430b-46b6-b862-99a7f86d11da-kube-api-access-cx7lx\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249701 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpz7t\" (UniqueName: \"kubernetes.io/projected/3423cc35-1ccb-487e-8d2e-fc72a2f03d9f-kube-api-access-cpz7t\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnbcl\" (UID: \"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249724 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/042c5d25-f716-4d9d-a663-28f18b9dfbc1-machine-approver-tls\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249749 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6e8f94ac-ad91-40e6-868d-e5e8de3de9b9-node-bootstrap-token\") pod \"machine-config-server-8dzsr\" (UID: \"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9\") " pod="openshift-machine-config-operator/machine-config-server-8dzsr"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249789 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83519b1e-5a50-4774-a86c-7117668abf6e-config\") pod \"kube-controller-manager-operator-78b949d7b-txd8j\" (UID: \"83519b1e-5a50-4774-a86c-7117668abf6e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249813 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a293056-ca09-4e84-86a5-11785aaa9a62-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d76gc\" (UID: \"5a293056-ca09-4e84-86a5-11785aaa9a62\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249840 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ca5f336c-cf5c-4e2a-b7fb-64acbd039052-available-featuregates\") pod \"openshift-config-operator-7777fb866f-t9bsg\" (UID: \"ca5f336c-cf5c-4e2a-b7fb-64acbd039052\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249876 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6a18fef7-00cd-4027-bec1-91ded07e3bfb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249900 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5f336c-cf5c-4e2a-b7fb-64acbd039052-serving-cert\") pod \"openshift-config-operator-7777fb866f-t9bsg\" (UID: \"ca5f336c-cf5c-4e2a-b7fb-64acbd039052\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249925 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rzh4\" (UniqueName: \"kubernetes.io/projected/7b31af78-d6bd-49eb-8d02-d22af087771d-kube-api-access-5rzh4\") pod \"dns-default-mxrlb\" (UID: \"7b31af78-d6bd-49eb-8d02-d22af087771d\") " pod="openshift-dns/dns-default-mxrlb"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249952 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-csi-data-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.249975 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6e8f94ac-ad91-40e6-868d-e5e8de3de9b9-certs\") pod \"machine-config-server-8dzsr\" (UID: \"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9\") " pod="openshift-machine-config-operator/machine-config-server-8dzsr"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250001 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljrc5\" (UniqueName: \"kubernetes.io/projected/5beed408-607e-42c8-be14-5b942a65f510-kube-api-access-ljrc5\") pod \"service-ca-9c57cc56f-qgfrx\" (UID: \"5beed408-607e-42c8-be14-5b942a65f510\") " pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250020 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ff4ba356-bf66-4e46-83f9-224c75fbbc85-tmpfs\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250043 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647fe5d7-4243-4608-8351-6bc2e13b9f15-secret-volume\") pod \"collect-profiles-29494920-hwfxp\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250068 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxn5v\" (UniqueName: \"kubernetes.io/projected/ca5f336c-cf5c-4e2a-b7fb-64acbd039052-kube-api-access-qxn5v\") pod \"openshift-config-operator-7777fb866f-t9bsg\" (UID: \"ca5f336c-cf5c-4e2a-b7fb-64acbd039052\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250091 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c4427451-802a-4521-9a21-3363d191e650-srv-cert\") pod \"catalog-operator-68c6474976-st6s2\" (UID: \"c4427451-802a-4521-9a21-3363d191e650\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250116 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db55f8c6-430b-46b6-b862-99a7f86d11da-trusted-ca\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250145 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5beed408-607e-42c8-be14-5b942a65f510-signing-key\") pod \"service-ca-9c57cc56f-qgfrx\" (UID: \"5beed408-607e-42c8-be14-5b942a65f510\") " pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250195 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vkq94\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkq94"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250353 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/042c5d25-f716-4d9d-a663-28f18b9dfbc1-auth-proxy-config\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250673 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ca5f336c-cf5c-4e2a-b7fb-64acbd039052-available-featuregates\") pod \"openshift-config-operator-7777fb866f-t9bsg\" (UID: \"ca5f336c-cf5c-4e2a-b7fb-64acbd039052\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.250967 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1db476d8-783c-4520-9b07-dfd4525a064f-metrics-certs\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251005 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8bc8\" (UniqueName: \"kubernetes.io/projected/266c621d-99dd-42a9-83c3-ac5288325710-kube-api-access-b8bc8\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251030 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6230cf30-7e1a-40ea-babe-72eb39ac2be7-etcd-client\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251052 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85dd3e88-2190-4d98-94d6-26b9fee9d20f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lb8gb\" (UID: \"85dd3e88-2190-4d98-94d6-26b9fee9d20f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251102 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-certificates\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251128 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e82f4e07-1554-4230-867f-427e7469c789-config\") pod \"kube-apiserver-operator-766d6c64bb-9gtxn\" (UID: \"e82f4e07-1554-4230-867f-427e7469c789\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251169 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86f9x\" (UniqueName: \"kubernetes.io/projected/6230cf30-7e1a-40ea-babe-72eb39ac2be7-kube-api-access-86f9x\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251194 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzg58\" (UniqueName: \"kubernetes.io/projected/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-kube-api-access-dzg58\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251219 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldbdr\" (UniqueName: \"kubernetes.io/projected/c4427451-802a-4521-9a21-3363d191e650-kube-api-access-ldbdr\") pod \"catalog-operator-68c6474976-st6s2\" (UID: \"c4427451-802a-4521-9a21-3363d191e650\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251239 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251260 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/266c621d-99dd-42a9-83c3-ac5288325710-metrics-tls\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251282 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-socket-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251306 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b31af78-d6bd-49eb-8d02-d22af087771d-metrics-tls\") pod \"dns-default-mxrlb\" (UID: \"7b31af78-d6bd-49eb-8d02-d22af087771d\") " pod="openshift-dns/dns-default-mxrlb"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251327 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f82mk\" (UniqueName: \"kubernetes.io/projected/19f60d75-bda2-4816-b146-f5e29203ffbc-kube-api-access-f82mk\") pod \"marketplace-operator-79b997595-vkq94\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkq94"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251351 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85dd3e88-2190-4d98-94d6-26b9fee9d20f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lb8gb\" (UID: \"85dd3e88-2190-4d98-94d6-26b9fee9d20f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251373 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6230cf30-7e1a-40ea-babe-72eb39ac2be7-etcd-service-ca\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251393 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1db476d8-783c-4520-9b07-dfd4525a064f-service-ca-bundle\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251416 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bslb\" (UniqueName: \"kubernetes.io/projected/6d8327fe-0e6c-46eb-b79e-6440451d8393-kube-api-access-8bslb\") pod \"multus-admission-controller-857f4d67dd-lvbft\" (UID: \"6d8327fe-0e6c-46eb-b79e-6440451d8393\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251441 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ed097d8a-81a4-4c18-960e-20041c9df31a-srv-cert\") pod \"olm-operator-6b444d44fb-h24bz\" (UID: \"ed097d8a-81a4-4c18-960e-20041c9df31a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251464 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e21223f9-863b-45dd-b641-afa73286591f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk"
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251490 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6230cf30-7e1a-40ea-babe-72eb39ac2be7-config\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k"
\"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251511 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6230cf30-7e1a-40ea-babe-72eb39ac2be7-serving-cert\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251533 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e82f4e07-1554-4230-867f-427e7469c789-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9gtxn\" (UID: \"e82f4e07-1554-4230-867f-427e7469c789\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251557 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e21223f9-863b-45dd-b641-afa73286591f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251577 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6cb6d4b-c2d5-4efd-8269-93827412a96d-config\") pod \"service-ca-operator-777779d784-fwm9n\" (UID: \"f6cb6d4b-c2d5-4efd-8269-93827412a96d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251597 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzxdq\" (UniqueName: \"kubernetes.io/projected/1db476d8-783c-4520-9b07-dfd4525a064f-kube-api-access-lzxdq\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251619 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33156757-7442-4962-ab8e-fe8e8a735fcc-service-ca-bundle\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251642 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1db476d8-783c-4520-9b07-dfd4525a064f-default-certificate\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251665 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1db476d8-783c-4520-9b07-dfd4525a064f-stats-auth\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251688 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/266c621d-99dd-42a9-83c3-ac5288325710-trusted-ca\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251727 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdxn\" (UniqueName: \"kubernetes.io/projected/dcbdaa74-121b-4b7c-bd7b-df2aa01a341c-kube-api-access-qrdxn\") pod \"cluster-samples-operator-665b6dd947-8fv88\" (UID: \"dcbdaa74-121b-4b7c-bd7b-df2aa01a341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251747 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvcjt\" (UniqueName: \"kubernetes.io/projected/f5f34f58-d006-4253-9328-eacdd6728e68-kube-api-access-hvcjt\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251770 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db55f8c6-430b-46b6-b862-99a7f86d11da-serving-cert\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251793 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2be7ad19-75d6-4c8e-bf95-5e62182fd5ac-proxy-tls\") pod \"machine-config-controller-84d6567774-6cp9x\" (UID: \"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251815 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905043f7-d346-450b-b46b-4ffcd29313af-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9dl5\" (UID: \"905043f7-d346-450b-b46b-4ffcd29313af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251836 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905043f7-d346-450b-b46b-4ffcd29313af-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9dl5\" (UID: \"905043f7-d346-450b-b46b-4ffcd29313af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251860 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ed95e9c-31ac-4716-8c19-a76da12afe85-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f297q\" (UID: \"6ed95e9c-31ac-4716-8c19-a76da12afe85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251882 
4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9cdn\" (UniqueName: \"kubernetes.io/projected/f6cb6d4b-c2d5-4efd-8269-93827412a96d-kube-api-access-c9cdn\") pod \"service-ca-operator-777779d784-fwm9n\" (UID: \"f6cb6d4b-c2d5-4efd-8269-93827412a96d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251904 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-proxy-tls\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251928 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ed097d8a-81a4-4c18-960e-20041c9df31a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h24bz\" (UID: \"ed097d8a-81a4-4c18-960e-20041c9df31a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251952 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8vfn\" (UniqueName: \"kubernetes.io/projected/905043f7-d346-450b-b46b-4ffcd29313af-kube-api-access-w8vfn\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9dl5\" (UID: \"905043f7-d346-450b-b46b-4ffcd29313af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251974 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmbv9\" (UniqueName: \"kubernetes.io/projected/ff4ba356-bf66-4e46-83f9-224c75fbbc85-kube-api-access-tmbv9\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.251995 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3423cc35-1ccb-487e-8d2e-fc72a2f03d9f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnbcl\" (UID: \"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252018 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czcmt\" (UniqueName: \"kubernetes.io/projected/647fe5d7-4243-4608-8351-6bc2e13b9f15-kube-api-access-czcmt\") pod \"collect-profiles-29494920-hwfxp\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252059 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgvn6\" (UniqueName: \"kubernetes.io/projected/042c5d25-f716-4d9d-a663-28f18b9dfbc1-kube-api-access-pgvn6\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" Jan 29 
14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252084 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmdkp\" (UniqueName: \"kubernetes.io/projected/ed097d8a-81a4-4c18-960e-20041c9df31a-kube-api-access-qmdkp\") pod \"olm-operator-6b444d44fb-h24bz\" (UID: \"ed097d8a-81a4-4c18-960e-20041c9df31a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252109 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff4ba356-bf66-4e46-83f9-224c75fbbc85-webhook-cert\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252130 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de6aa533-9dd6-4579-aee2-38c2aebc7e31-cert\") pod \"ingress-canary-5qnph\" (UID: \"de6aa533-9dd6-4579-aee2-38c2aebc7e31\") " pod="openshift-ingress-canary/ingress-canary-5qnph" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252167 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-bound-sa-token\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252191 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff4ba356-bf66-4e46-83f9-224c75fbbc85-apiservice-cert\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252210 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6a18fef7-00cd-4027-bec1-91ded07e3bfb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252233 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2be7ad19-75d6-4c8e-bf95-5e62182fd5ac-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6cp9x\" (UID: \"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252257 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f98pb\" (UniqueName: \"kubernetes.io/projected/a98eae96-08b8-4589-ab80-07e0a551d398-kube-api-access-f98pb\") pod \"migrator-59844c95c7-99c7v\" (UID: \"a98eae96-08b8-4589-ab80-07e0a551d398\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252278 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6230cf30-7e1a-40ea-babe-72eb39ac2be7-etcd-ca\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253068 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e21223f9-863b-45dd-b641-afa73286591f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253097 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/266c621d-99dd-42a9-83c3-ac5288325710-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253130 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-images\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253178 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33156757-7442-4962-ab8e-fe8e8a735fcc-serving-cert\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253197 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e82f4e07-1554-4230-867f-427e7469c789-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9gtxn\" (UID: \"e82f4e07-1554-4230-867f-427e7469c789\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253215 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85dd3e88-2190-4d98-94d6-26b9fee9d20f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lb8gb\" (UID: \"85dd3e88-2190-4d98-94d6-26b9fee9d20f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253236 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-registration-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253255 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt49s\" (UniqueName: 
\"kubernetes.io/projected/33156757-7442-4962-ab8e-fe8e8a735fcc-kube-api-access-vt49s\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253296 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js8l9\" (UniqueName: \"kubernetes.io/projected/e21223f9-863b-45dd-b641-afa73286591f-kube-api-access-js8l9\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253317 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cmg4\" (UniqueName: \"kubernetes.io/projected/6ed95e9c-31ac-4716-8c19-a76da12afe85-kube-api-access-8cmg4\") pod \"package-server-manager-789f6589d5-f297q\" (UID: \"6ed95e9c-31ac-4716-8c19-a76da12afe85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253441 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6cb6d4b-c2d5-4efd-8269-93827412a96d-config\") pod \"service-ca-operator-777779d784-fwm9n\" (UID: \"f6cb6d4b-c2d5-4efd-8269-93827412a96d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253447 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253513 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33156757-7442-4962-ab8e-fe8e8a735fcc-config\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253543 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5beed408-607e-42c8-be14-5b942a65f510-signing-cabundle\") pod \"service-ca-9c57cc56f-qgfrx\" (UID: \"5beed408-607e-42c8-be14-5b942a65f510\") " pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253581 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfcv2\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-kube-api-access-jfcv2\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253609 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/c4427451-802a-4521-9a21-3363d191e650-profile-collector-cert\") pod \"catalog-operator-68c6474976-st6s2\" (UID: \"c4427451-802a-4521-9a21-3363d191e650\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253633 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2qhq\" (UniqueName: \"kubernetes.io/projected/6e8f94ac-ad91-40e6-868d-e5e8de3de9b9-kube-api-access-n2qhq\") pod \"machine-config-server-8dzsr\" (UID: \"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9\") " pod="openshift-machine-config-operator/machine-config-server-8dzsr" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253668 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647fe5d7-4243-4608-8351-6bc2e13b9f15-config-volume\") pod \"collect-profiles-29494920-hwfxp\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253700 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3423cc35-1ccb-487e-8d2e-fc72a2f03d9f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnbcl\" (UID: \"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253722 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-mountpoint-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253743 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83519b1e-5a50-4774-a86c-7117668abf6e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-txd8j\" (UID: \"83519b1e-5a50-4774-a86c-7117668abf6e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253768 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvvp9\" (UniqueName: \"kubernetes.io/projected/5a293056-ca09-4e84-86a5-11785aaa9a62-kube-api-access-tvvp9\") pod \"control-plane-machine-set-operator-78cbb6b69f-d76gc\" (UID: \"5a293056-ca09-4e84-86a5-11785aaa9a62\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253794 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6d8327fe-0e6c-46eb-b79e-6440451d8393-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lvbft\" (UID: \"6d8327fe-0e6c-46eb-b79e-6440451d8393\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253817 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b31af78-d6bd-49eb-8d02-d22af087771d-config-volume\") pod \"dns-default-mxrlb\" (UID: \"7b31af78-d6bd-49eb-8d02-d22af087771d\") " pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253836 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-plugins-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253865 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-842ww\" (UniqueName: \"kubernetes.io/projected/2be7ad19-75d6-4c8e-bf95-5e62182fd5ac-kube-api-access-842ww\") pod \"machine-config-controller-84d6567774-6cp9x\" (UID: \"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253886 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-tls\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253902 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db55f8c6-430b-46b6-b862-99a7f86d11da-config\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253927 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042c5d25-f716-4d9d-a663-28f18b9dfbc1-config\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253950 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33156757-7442-4962-ab8e-fe8e8a735fcc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253967 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-trusted-ca\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.253985 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcbdaa74-121b-4b7c-bd7b-df2aa01a341c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8fv88\" (UID: \"dcbdaa74-121b-4b7c-bd7b-df2aa01a341c\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.254004 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vkq94\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.252048 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db55f8c6-430b-46b6-b862-99a7f86d11da-trusted-ca\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.254989 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2be7ad19-75d6-4c8e-bf95-5e62182fd5ac-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6cp9x\" (UID: \"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.255226 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e82f4e07-1554-4230-867f-427e7469c789-config\") pod \"kube-apiserver-operator-766d6c64bb-9gtxn\" (UID: \"e82f4e07-1554-4230-867f-427e7469c789\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.255406 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6cb6d4b-c2d5-4efd-8269-93827412a96d-serving-cert\") pod \"service-ca-operator-777779d784-fwm9n\" (UID: \"f6cb6d4b-c2d5-4efd-8269-93827412a96d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.255472 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6a18fef7-00cd-4027-bec1-91ded07e3bfb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.256000 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3423cc35-1ccb-487e-8d2e-fc72a2f03d9f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnbcl\" (UID: \"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.256988 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33156757-7442-4962-ab8e-fe8e8a735fcc-service-ca-bundle\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 
14:04:19.257796 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-certificates\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.257819 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/042c5d25-f716-4d9d-a663-28f18b9dfbc1-machine-approver-tls\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.257852 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1db476d8-783c-4520-9b07-dfd4525a064f-metrics-certs\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.258502 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b31af78-d6bd-49eb-8d02-d22af087771d-config-volume\") pod \"dns-default-mxrlb\" (UID: \"7b31af78-d6bd-49eb-8d02-d22af087771d\") " pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.258706 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1db476d8-783c-4520-9b07-dfd4525a064f-service-ca-bundle\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.259187 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e82f4e07-1554-4230-867f-427e7469c789-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9gtxn\" (UID: \"e82f4e07-1554-4230-867f-427e7469c789\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.259659 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33156757-7442-4962-ab8e-fe8e8a735fcc-config\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.260301 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905043f7-d346-450b-b46b-4ffcd29313af-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9dl5\" (UID: \"905043f7-d346-450b-b46b-4ffcd29313af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.260428 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e21223f9-863b-45dd-b641-afa73286591f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: 
\"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.261646 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e21223f9-863b-45dd-b641-afa73286591f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.261682 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ed097d8a-81a4-4c18-960e-20041c9df31a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h24bz\" (UID: \"ed097d8a-81a4-4c18-960e-20041c9df31a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.262726 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:19.76269262 +0000 UTC m=+94.457427002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.263843 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db55f8c6-430b-46b6-b862-99a7f86d11da-config\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.263904 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33156757-7442-4962-ab8e-fe8e8a735fcc-serving-cert\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.264299 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ed097d8a-81a4-4c18-960e-20041c9df31a-srv-cert\") pod \"olm-operator-6b444d44fb-h24bz\" (UID: \"ed097d8a-81a4-4c18-960e-20041c9df31a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.264445 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-trusted-ca\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.264786 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33156757-7442-4962-ab8e-fe8e8a735fcc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.266188 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b31af78-d6bd-49eb-8d02-d22af087771d-metrics-tls\") pod \"dns-default-mxrlb\" (UID: \"7b31af78-d6bd-49eb-8d02-d22af087771d\") " pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.266431 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ed95e9c-31ac-4716-8c19-a76da12afe85-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f297q\" (UID: \"6ed95e9c-31ac-4716-8c19-a76da12afe85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.266667 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042c5d25-f716-4d9d-a663-28f18b9dfbc1-config\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.267922 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1db476d8-783c-4520-9b07-dfd4525a064f-stats-auth\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.268765 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2be7ad19-75d6-4c8e-bf95-5e62182fd5ac-proxy-tls\") pod \"machine-config-controller-84d6567774-6cp9x\" (UID: \"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.268942 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-tls\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.269093 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6a18fef7-00cd-4027-bec1-91ded07e3bfb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.269478 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db55f8c6-430b-46b6-b862-99a7f86d11da-serving-cert\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b" 
Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.269666 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6d8327fe-0e6c-46eb-b79e-6440451d8393-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lvbft\" (UID: \"6d8327fe-0e6c-46eb-b79e-6440451d8393\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.270209 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5f336c-cf5c-4e2a-b7fb-64acbd039052-serving-cert\") pod \"openshift-config-operator-7777fb866f-t9bsg\" (UID: \"ca5f336c-cf5c-4e2a-b7fb-64acbd039052\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.274657 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3423cc35-1ccb-487e-8d2e-fc72a2f03d9f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnbcl\" (UID: \"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.275077 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1db476d8-783c-4520-9b07-dfd4525a064f-default-certificate\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.275203 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcbdaa74-121b-4b7c-bd7b-df2aa01a341c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8fv88\" (UID: \"dcbdaa74-121b-4b7c-bd7b-df2aa01a341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.276486 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905043f7-d346-450b-b46b-4ffcd29313af-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9dl5\" (UID: \"905043f7-d346-450b-b46b-4ffcd29313af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.296956 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx7lx\" (UniqueName: \"kubernetes.io/projected/db55f8c6-430b-46b6-b862-99a7f86d11da-kube-api-access-cx7lx\") pod \"console-operator-58897d9998-rls7b\" (UID: \"db55f8c6-430b-46b6-b862-99a7f86d11da\") " pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.316283 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpz7t\" (UniqueName: \"kubernetes.io/projected/3423cc35-1ccb-487e-8d2e-fc72a2f03d9f-kube-api-access-cpz7t\") pod \"openshift-controller-manager-operator-756b6f6bc6-lnbcl\" (UID: \"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.341878 
4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxn5v\" (UniqueName: \"kubernetes.io/projected/ca5f336c-cf5c-4e2a-b7fb-64acbd039052-kube-api-access-qxn5v\") pod \"openshift-config-operator-7777fb866f-t9bsg\" (UID: \"ca5f336c-cf5c-4e2a-b7fb-64acbd039052\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.351659 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355176 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355385 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/266c621d-99dd-42a9-83c3-ac5288325710-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355444 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-images\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355480 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85dd3e88-2190-4d98-94d6-26b9fee9d20f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lb8gb\" (UID: \"85dd3e88-2190-4d98-94d6-26b9fee9d20f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355532 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-registration-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355570 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5beed408-607e-42c8-be14-5b942a65f510-signing-cabundle\") pod \"service-ca-9c57cc56f-qgfrx\" (UID: \"5beed408-607e-42c8-be14-5b942a65f510\") " pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355606 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c4427451-802a-4521-9a21-3363d191e650-profile-collector-cert\") pod \"catalog-operator-68c6474976-st6s2\" (UID: \"c4427451-802a-4521-9a21-3363d191e650\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:19 crc 
kubenswrapper[4753]: I0129 14:04:19.355634 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2qhq\" (UniqueName: \"kubernetes.io/projected/6e8f94ac-ad91-40e6-868d-e5e8de3de9b9-kube-api-access-n2qhq\") pod \"machine-config-server-8dzsr\" (UID: \"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9\") " pod="openshift-machine-config-operator/machine-config-server-8dzsr" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355672 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647fe5d7-4243-4608-8351-6bc2e13b9f15-config-volume\") pod \"collect-profiles-29494920-hwfxp\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355701 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-mountpoint-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355728 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83519b1e-5a50-4774-a86c-7117668abf6e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-txd8j\" (UID: \"83519b1e-5a50-4774-a86c-7117668abf6e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355751 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvvp9\" (UniqueName: \"kubernetes.io/projected/5a293056-ca09-4e84-86a5-11785aaa9a62-kube-api-access-tvvp9\") pod \"control-plane-machine-set-operator-78cbb6b69f-d76gc\" (UID: \"5a293056-ca09-4e84-86a5-11785aaa9a62\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355777 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-plugins-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355820 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vkq94\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355850 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8lc6\" (UniqueName: \"kubernetes.io/projected/de6aa533-9dd6-4579-aee2-38c2aebc7e31-kube-api-access-f8lc6\") pod \"ingress-canary-5qnph\" (UID: \"de6aa533-9dd6-4579-aee2-38c2aebc7e31\") " pod="openshift-ingress-canary/ingress-canary-5qnph" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355875 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/83519b1e-5a50-4774-a86c-7117668abf6e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-txd8j\" (UID: \"83519b1e-5a50-4774-a86c-7117668abf6e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355903 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6e8f94ac-ad91-40e6-868d-e5e8de3de9b9-node-bootstrap-token\") pod \"machine-config-server-8dzsr\" (UID: \"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9\") " pod="openshift-machine-config-operator/machine-config-server-8dzsr" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355926 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83519b1e-5a50-4774-a86c-7117668abf6e-config\") pod \"kube-controller-manager-operator-78b949d7b-txd8j\" (UID: \"83519b1e-5a50-4774-a86c-7117668abf6e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355954 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a293056-ca09-4e84-86a5-11785aaa9a62-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d76gc\" (UID: \"5a293056-ca09-4e84-86a5-11785aaa9a62\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.355980 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-csi-data-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356024 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6e8f94ac-ad91-40e6-868d-e5e8de3de9b9-certs\") pod \"machine-config-server-8dzsr\" (UID: \"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9\") " pod="openshift-machine-config-operator/machine-config-server-8dzsr" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356050 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljrc5\" (UniqueName: \"kubernetes.io/projected/5beed408-607e-42c8-be14-5b942a65f510-kube-api-access-ljrc5\") pod \"service-ca-9c57cc56f-qgfrx\" (UID: \"5beed408-607e-42c8-be14-5b942a65f510\") " pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356075 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ff4ba356-bf66-4e46-83f9-224c75fbbc85-tmpfs\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356106 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647fe5d7-4243-4608-8351-6bc2e13b9f15-secret-volume\") pod \"collect-profiles-29494920-hwfxp\" (UID: 
\"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356130 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c4427451-802a-4521-9a21-3363d191e650-srv-cert\") pod \"catalog-operator-68c6474976-st6s2\" (UID: \"c4427451-802a-4521-9a21-3363d191e650\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356177 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5beed408-607e-42c8-be14-5b942a65f510-signing-key\") pod \"service-ca-9c57cc56f-qgfrx\" (UID: \"5beed408-607e-42c8-be14-5b942a65f510\") " pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356206 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vkq94\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356231 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8bc8\" (UniqueName: \"kubernetes.io/projected/266c621d-99dd-42a9-83c3-ac5288325710-kube-api-access-b8bc8\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356258 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6230cf30-7e1a-40ea-babe-72eb39ac2be7-etcd-client\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356282 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85dd3e88-2190-4d98-94d6-26b9fee9d20f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lb8gb\" (UID: \"85dd3e88-2190-4d98-94d6-26b9fee9d20f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356309 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86f9x\" (UniqueName: \"kubernetes.io/projected/6230cf30-7e1a-40ea-babe-72eb39ac2be7-kube-api-access-86f9x\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356332 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzg58\" (UniqueName: \"kubernetes.io/projected/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-kube-api-access-dzg58\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: 
I0129 14:04:19.356359 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/266c621d-99dd-42a9-83c3-ac5288325710-metrics-tls\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356382 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-socket-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356404 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldbdr\" (UniqueName: \"kubernetes.io/projected/c4427451-802a-4521-9a21-3363d191e650-kube-api-access-ldbdr\") pod \"catalog-operator-68c6474976-st6s2\" (UID: \"c4427451-802a-4521-9a21-3363d191e650\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356430 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356458 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f82mk\" (UniqueName: \"kubernetes.io/projected/19f60d75-bda2-4816-b146-f5e29203ffbc-kube-api-access-f82mk\") pod \"marketplace-operator-79b997595-vkq94\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356485 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85dd3e88-2190-4d98-94d6-26b9fee9d20f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lb8gb\" (UID: \"85dd3e88-2190-4d98-94d6-26b9fee9d20f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356510 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6230cf30-7e1a-40ea-babe-72eb39ac2be7-etcd-service-ca\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356547 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6230cf30-7e1a-40ea-babe-72eb39ac2be7-config\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356574 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6230cf30-7e1a-40ea-babe-72eb39ac2be7-serving-cert\") pod 
\"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356625 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-images\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356636 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/266c621d-99dd-42a9-83c3-ac5288325710-trusted-ca\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356675 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvcjt\" (UniqueName: \"kubernetes.io/projected/f5f34f58-d006-4253-9328-eacdd6728e68-kube-api-access-hvcjt\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356723 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5beed408-607e-42c8-be14-5b942a65f510-signing-cabundle\") pod \"service-ca-9c57cc56f-qgfrx\" (UID: \"5beed408-607e-42c8-be14-5b942a65f510\") " pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.356724 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:19.856706854 +0000 UTC m=+94.551441246 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356778 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-proxy-tls\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356819 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmbv9\" (UniqueName: \"kubernetes.io/projected/ff4ba356-bf66-4e46-83f9-224c75fbbc85-kube-api-access-tmbv9\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356869 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czcmt\" (UniqueName: \"kubernetes.io/projected/647fe5d7-4243-4608-8351-6bc2e13b9f15-kube-api-access-czcmt\") pod \"collect-profiles-29494920-hwfxp\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356932 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff4ba356-bf66-4e46-83f9-224c75fbbc85-webhook-cert\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356956 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de6aa533-9dd6-4579-aee2-38c2aebc7e31-cert\") pod \"ingress-canary-5qnph\" (UID: \"de6aa533-9dd6-4579-aee2-38c2aebc7e31\") " pod="openshift-ingress-canary/ingress-canary-5qnph" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.356991 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff4ba356-bf66-4e46-83f9-224c75fbbc85-apiservice-cert\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.357015 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f98pb\" (UniqueName: \"kubernetes.io/projected/a98eae96-08b8-4589-ab80-07e0a551d398-kube-api-access-f98pb\") pod \"migrator-59844c95c7-99c7v\" (UID: \"a98eae96-08b8-4589-ab80-07e0a551d398\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.357042 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/6230cf30-7e1a-40ea-babe-72eb39ac2be7-etcd-ca\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.357640 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6230cf30-7e1a-40ea-babe-72eb39ac2be7-etcd-ca\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.359858 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rzh4\" (UniqueName: \"kubernetes.io/projected/7b31af78-d6bd-49eb-8d02-d22af087771d-kube-api-access-5rzh4\") pod \"dns-default-mxrlb\" (UID: \"7b31af78-d6bd-49eb-8d02-d22af087771d\") " pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.360420 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-registration-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.362413 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c4427451-802a-4521-9a21-3363d191e650-srv-cert\") pod \"catalog-operator-68c6474976-st6s2\" (UID: \"c4427451-802a-4521-9a21-3363d191e650\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.363489 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647fe5d7-4243-4608-8351-6bc2e13b9f15-config-volume\") pod \"collect-profiles-29494920-hwfxp\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.363574 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-mountpoint-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.363740 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-plugins-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.364978 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vkq94\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.365897 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-csi-data-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.368501 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85dd3e88-2190-4d98-94d6-26b9fee9d20f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lb8gb\" (UID: \"85dd3e88-2190-4d98-94d6-26b9fee9d20f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.369340 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83519b1e-5a50-4774-a86c-7117668abf6e-config\") pod \"kube-controller-manager-operator-78b949d7b-txd8j\" (UID: \"83519b1e-5a50-4774-a86c-7117668abf6e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.369378 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83519b1e-5a50-4774-a86c-7117668abf6e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-txd8j\" (UID: \"83519b1e-5a50-4774-a86c-7117668abf6e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.369676 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ff4ba356-bf66-4e46-83f9-224c75fbbc85-tmpfs\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.370790 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c4427451-802a-4521-9a21-3363d191e650-profile-collector-cert\") pod \"catalog-operator-68c6474976-st6s2\" (UID: \"c4427451-802a-4521-9a21-3363d191e650\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.371071 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6e8f94ac-ad91-40e6-868d-e5e8de3de9b9-node-bootstrap-token\") pod \"machine-config-server-8dzsr\" (UID: \"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9\") " pod="openshift-machine-config-operator/machine-config-server-8dzsr" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.371881 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f5f34f58-d006-4253-9328-eacdd6728e68-socket-dir\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.372254 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6230cf30-7e1a-40ea-babe-72eb39ac2be7-etcd-service-ca\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 
29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.372372 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff4ba356-bf66-4e46-83f9-224c75fbbc85-webhook-cert\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.373019 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6230cf30-7e1a-40ea-babe-72eb39ac2be7-config\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.373542 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.373588 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647fe5d7-4243-4608-8351-6bc2e13b9f15-secret-volume\") pod \"collect-profiles-29494920-hwfxp\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.374218 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/266c621d-99dd-42a9-83c3-ac5288325710-trusted-ca\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.374799 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a293056-ca09-4e84-86a5-11785aaa9a62-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d76gc\" (UID: \"5a293056-ca09-4e84-86a5-11785aaa9a62\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.376329 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85dd3e88-2190-4d98-94d6-26b9fee9d20f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lb8gb\" (UID: \"85dd3e88-2190-4d98-94d6-26b9fee9d20f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.377417 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6230cf30-7e1a-40ea-babe-72eb39ac2be7-serving-cert\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.377896 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vkq94\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.378534 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de6aa533-9dd6-4579-aee2-38c2aebc7e31-cert\") pod \"ingress-canary-5qnph\" (UID: \"de6aa533-9dd6-4579-aee2-38c2aebc7e31\") " pod="openshift-ingress-canary/ingress-canary-5qnph" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.378569 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff4ba356-bf66-4e46-83f9-224c75fbbc85-apiservice-cert\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.378782 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.378967 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6230cf30-7e1a-40ea-babe-72eb39ac2be7-etcd-client\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.378983 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/266c621d-99dd-42a9-83c3-ac5288325710-metrics-tls\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.379169 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6e8f94ac-ad91-40e6-868d-e5e8de3de9b9-certs\") pod \"machine-config-server-8dzsr\" (UID: \"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9\") " pod="openshift-machine-config-operator/machine-config-server-8dzsr" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.379552 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5beed408-607e-42c8-be14-5b942a65f510-signing-key\") pod \"service-ca-9c57cc56f-qgfrx\" (UID: \"5beed408-607e-42c8-be14-5b942a65f510\") " pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.379979 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-proxy-tls\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.385397 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.386094 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e82f4e07-1554-4230-867f-427e7469c789-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9gtxn\" (UID: \"e82f4e07-1554-4230-867f-427e7469c789\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.396019 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e21223f9-863b-45dd-b641-afa73286591f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.410345 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.414511 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8vfn\" (UniqueName: \"kubernetes.io/projected/905043f7-d346-450b-b46b-4ffcd29313af-kube-api-access-w8vfn\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9dl5\" (UID: \"905043f7-d346-450b-b46b-4ffcd29313af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.435552 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmdkp\" (UniqueName: \"kubernetes.io/projected/ed097d8a-81a4-4c18-960e-20041c9df31a-kube-api-access-qmdkp\") pod \"olm-operator-6b444d44fb-h24bz\" (UID: \"ed097d8a-81a4-4c18-960e-20041c9df31a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.435955 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.452991 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.460681 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-bound-sa-token\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.460917 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.461435 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:19.961420217 +0000 UTC m=+94.656154599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.474712 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.478673 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt49s\" (UniqueName: \"kubernetes.io/projected/33156757-7442-4962-ab8e-fe8e8a735fcc-kube-api-access-vt49s\") pod \"authentication-operator-69f744f599-944nm\" (UID: \"33156757-7442-4962-ab8e-fe8e8a735fcc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.496837 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgvn6\" (UniqueName: \"kubernetes.io/projected/042c5d25-f716-4d9d-a663-28f18b9dfbc1-kube-api-access-pgvn6\") pod \"machine-approver-56656f9798-fv6pm\" (UID: \"042c5d25-f716-4d9d-a663-28f18b9dfbc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.518229 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cmg4\" (UniqueName: \"kubernetes.io/projected/6ed95e9c-31ac-4716-8c19-a76da12afe85-kube-api-access-8cmg4\") pod \"package-server-manager-789f6589d5-f297q\" (UID: \"6ed95e9c-31ac-4716-8c19-a76da12afe85\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.539336 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js8l9\" (UniqueName: \"kubernetes.io/projected/e21223f9-863b-45dd-b641-afa73286591f-kube-api-access-js8l9\") pod \"cluster-image-registry-operator-dc59b4c8b-lmpvk\" (UID: \"e21223f9-863b-45dd-b641-afa73286591f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.557271 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bslb\" (UniqueName: \"kubernetes.io/projected/6d8327fe-0e6c-46eb-b79e-6440451d8393-kube-api-access-8bslb\") pod \"multus-admission-controller-857f4d67dd-lvbft\" (UID: \"6d8327fe-0e6c-46eb-b79e-6440451d8393\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.562602 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.562839 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.062808467 +0000 UTC m=+94.757542849 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.565386 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.565806 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.06579353 +0000 UTC m=+94.760527912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.583587 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzxdq\" (UniqueName: \"kubernetes.io/projected/1db476d8-783c-4520-9b07-dfd4525a064f-kube-api-access-lzxdq\") pod \"router-default-5444994796-4v9zm\" (UID: \"1db476d8-783c-4520-9b07-dfd4525a064f\") " pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.597504 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl"] Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.606953 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-842ww\" (UniqueName: \"kubernetes.io/projected/2be7ad19-75d6-4c8e-bf95-5e62182fd5ac-kube-api-access-842ww\") pod \"machine-config-controller-84d6567774-6cp9x\" (UID: \"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.617581 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfcv2\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-kube-api-access-jfcv2\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: W0129 14:04:19.623916 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3423cc35_1ccb_487e_8d2e_fc72a2f03d9f.slice/crio-656a24a8bffbcfbef78609a0808aa9ffaf5f75579a0276d0f9c057d5f9df4b66 WatchSource:0}: Error finding container 
656a24a8bffbcfbef78609a0808aa9ffaf5f75579a0276d0f9c057d5f9df4b66: Status 404 returned error can't find the container with id 656a24a8bffbcfbef78609a0808aa9ffaf5f75579a0276d0f9c057d5f9df4b66 Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.657181 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdxn\" (UniqueName: \"kubernetes.io/projected/dcbdaa74-121b-4b7c-bd7b-df2aa01a341c-kube-api-access-qrdxn\") pod \"cluster-samples-operator-665b6dd947-8fv88\" (UID: \"dcbdaa74-121b-4b7c-bd7b-df2aa01a341c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.659334 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.663699 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9cdn\" (UniqueName: \"kubernetes.io/projected/f6cb6d4b-c2d5-4efd-8269-93827412a96d-kube-api-access-c9cdn\") pod \"service-ca-operator-777779d784-fwm9n\" (UID: \"f6cb6d4b-c2d5-4efd-8269-93827412a96d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.666882 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.667171 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpfcs\" (UniqueName: \"kubernetes.io/projected/776ea39d-e7fb-497f-bfaf-41b385b76754-kube-api-access-gpfcs\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.669327 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.169297299 +0000 UTC m=+94.864031681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.669645 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.671010 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpfcs\" (UniqueName: \"kubernetes.io/projected/776ea39d-e7fb-497f-bfaf-41b385b76754-kube-api-access-gpfcs\") pod \"route-controller-manager-6576b87f9c-lhpqt\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.699312 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/266c621d-99dd-42a9-83c3-ac5288325710-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.699939 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.713727 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.720216 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czcmt\" (UniqueName: \"kubernetes.io/projected/647fe5d7-4243-4608-8351-6bc2e13b9f15-kube-api-access-czcmt\") pod \"collect-profiles-29494920-hwfxp\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.722591 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.732854 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.740997 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2qhq\" (UniqueName: \"kubernetes.io/projected/6e8f94ac-ad91-40e6-868d-e5e8de3de9b9-kube-api-access-n2qhq\") pod \"machine-config-server-8dzsr\" (UID: \"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9\") " pod="openshift-machine-config-operator/machine-config-server-8dzsr" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.743954 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.758530 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.762617 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83519b1e-5a50-4774-a86c-7117668abf6e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-txd8j\" (UID: \"83519b1e-5a50-4774-a86c-7117668abf6e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.765758 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.765813 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" event={"ID":"042c5d25-f716-4d9d-a663-28f18b9dfbc1","Type":"ContainerStarted","Data":"c5289a63447ad28b72181703df89475e430a598df301f627c1211e5f5a047716"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.768344 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.768780 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.268767336 +0000 UTC m=+94.963501708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.771130 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" event={"ID":"842d8f24-8875-4968-a486-99a53a278850","Type":"ContainerStarted","Data":"73e34a687ed588cdfe8b40894365bc41654beafbfbe12e8814e17181f558881f"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.772605 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" event={"ID":"78dab1db-992a-4ae3-97a1-d70613ac41fe","Type":"ContainerStarted","Data":"18033989f8b3f875c4978e3927d8cf1dc46d2fd372bbf725b450a1a2ec61bdd3"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.772642 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" event={"ID":"78dab1db-992a-4ae3-97a1-d70613ac41fe","Type":"ContainerStarted","Data":"ee6e16a7fe0c1195e5fd121cd22805c2ef2ce4f114481aa3d913e4c66e88d1e3"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.781290 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvvp9\" (UniqueName: \"kubernetes.io/projected/5a293056-ca09-4e84-86a5-11785aaa9a62-kube-api-access-tvvp9\") pod \"control-plane-machine-set-operator-78cbb6b69f-d76gc\" (UID: \"5a293056-ca09-4e84-86a5-11785aaa9a62\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.791196 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jbfcj" event={"ID":"f5125f20-92c4-4700-8c7f-f8c7fc7b48b9","Type":"ContainerStarted","Data":"da34c50a3885b67a7de3c5159893a9a565cd0d1b54cf47306aa8d169f4cf4c2e"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.791260 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jbfcj" event={"ID":"f5125f20-92c4-4700-8c7f-f8c7fc7b48b9","Type":"ContainerStarted","Data":"3fe963dc22f3d21d06aa7324ffc48ac9a42ee173a2e3e7ce1a3b454e9e87901f"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.792353 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jbfcj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.800224 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8lc6\" (UniqueName: \"kubernetes.io/projected/de6aa533-9dd6-4579-aee2-38c2aebc7e31-kube-api-access-f8lc6\") pod \"ingress-canary-5qnph\" (UID: \"de6aa533-9dd6-4579-aee2-38c2aebc7e31\") " pod="openshift-ingress-canary/ingress-canary-5qnph" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.802422 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bxx4k" event={"ID":"810f6f52-858d-46f6-b2d2-71f9c3135263","Type":"ContainerStarted","Data":"afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.802455 4753 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-f9d7485db-bxx4k" event={"ID":"810f6f52-858d-46f6-b2d2-71f9c3135263","Type":"ContainerStarted","Data":"ec77a16d63b0f3e13f632231d0532b6182f70ffc8ad80a54069e8b11c4af903c"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.814101 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbfcj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.814180 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jbfcj" podUID="f5125f20-92c4-4700-8c7f-f8c7fc7b48b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.820613 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.825114 4753 generic.go:334] "Generic (PLEG): container finished" podID="8a7be31e-2bc5-4b5c-8d26-a220f35b87d4" containerID="0e801cfeed6330a17a6ba693abfc32df3e4ddbe8f0c2544515a734e2509781f9" exitCode=0 Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.825455 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" event={"ID":"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4","Type":"ContainerDied","Data":"0e801cfeed6330a17a6ba693abfc32df3e4ddbe8f0c2544515a734e2509781f9"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.839813 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.846571 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz"] Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.846705 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmbv9\" (UniqueName: \"kubernetes.io/projected/ff4ba356-bf66-4e46-83f9-224c75fbbc85-kube-api-access-tmbv9\") pod \"packageserver-d55dfcdfc-l9ddt\" (UID: \"ff4ba356-bf66-4e46-83f9-224c75fbbc85\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.851554 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" event={"ID":"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f","Type":"ContainerStarted","Data":"656a24a8bffbcfbef78609a0808aa9ffaf5f75579a0276d0f9c057d5f9df4b66"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.865959 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f98pb\" (UniqueName: \"kubernetes.io/projected/a98eae96-08b8-4589-ab80-07e0a551d398-kube-api-access-f98pb\") pod \"migrator-59844c95c7-99c7v\" (UID: \"a98eae96-08b8-4589-ab80-07e0a551d398\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.868220 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljrc5\" (UniqueName: \"kubernetes.io/projected/5beed408-607e-42c8-be14-5b942a65f510-kube-api-access-ljrc5\") pod \"service-ca-9c57cc56f-qgfrx\" (UID: \"5beed408-607e-42c8-be14-5b942a65f510\") " pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.869327 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.873222 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.37137717 +0000 UTC m=+95.066111552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.880471 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.882984 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rls7b"] Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.891507 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.901804 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldbdr\" (UniqueName: \"kubernetes.io/projected/c4427451-802a-4521-9a21-3363d191e650-kube-api-access-ldbdr\") pod \"catalog-operator-68c6474976-st6s2\" (UID: \"c4427451-802a-4521-9a21-3363d191e650\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.910295 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.910787 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5qnph" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.932960 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8dzsr" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.942109 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" event={"ID":"cd6a836a-9268-4a6a-ab7f-00605823dc9f","Type":"ContainerStarted","Data":"483fb81f9c285b645982eb6b910750414122cfc1ebedd32a3ddea2bee48a26a7"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.942176 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" event={"ID":"cd6a836a-9268-4a6a-ab7f-00605823dc9f","Type":"ContainerStarted","Data":"544d4efee7254d34e8e6bd2fa08e00c5d00021ffc6e8b71b3fb4a9853710fc9d"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.943355 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86f9x\" (UniqueName: \"kubernetes.io/projected/6230cf30-7e1a-40ea-babe-72eb39ac2be7-kube-api-access-86f9x\") pod \"etcd-operator-b45778765-mtr5k\" (UID: \"6230cf30-7e1a-40ea-babe-72eb39ac2be7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.952746 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8bc8\" (UniqueName: \"kubernetes.io/projected/266c621d-99dd-42a9-83c3-ac5288325710-kube-api-access-b8bc8\") pod \"ingress-operator-5b745b69d9-gqggj\" (UID: \"266c621d-99dd-42a9-83c3-ac5288325710\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.960815 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85dd3e88-2190-4d98-94d6-26b9fee9d20f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lb8gb\" (UID: \"85dd3e88-2190-4d98-94d6-26b9fee9d20f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.961009 4753 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5"] Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.971185 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:19 crc kubenswrapper[4753]: E0129 14:04:19.972061 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.47204535 +0000 UTC m=+95.166779732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.973513 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" event={"ID":"4a83492d-36e3-4400-a969-71934ecfc9f7","Type":"ContainerStarted","Data":"bab852b08fa726a717811425c76977d4b8a81a9e88e319395564a351fec56207"} Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.974081 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.983782 4753 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gn6zd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.983849 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" podUID="4a83492d-36e3-4400-a969-71934ecfc9f7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.984511 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzg58\" (UniqueName: \"kubernetes.io/projected/0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc-kube-api-access-dzg58\") pod \"machine-config-operator-74547568cd-gfb7j\" (UID: \"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:19 crc kubenswrapper[4753]: I0129 14:04:19.998877 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f82mk\" (UniqueName: \"kubernetes.io/projected/19f60d75-bda2-4816-b146-f5e29203ffbc-kube-api-access-f82mk\") pod \"marketplace-operator-79b997595-vkq94\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 
14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.001646 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg"] Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.002124 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.027038 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvcjt\" (UniqueName: \"kubernetes.io/projected/f5f34f58-d006-4253-9328-eacdd6728e68-kube-api-access-hvcjt\") pod \"csi-hostpathplugin-58ksd\" (UID: \"f5f34f58-d006-4253-9328-eacdd6728e68\") " pod="hostpath-provisioner/csi-hostpathplugin-58ksd" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.028541 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn"] Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.076051 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.079221 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.579175861 +0000 UTC m=+95.273910253 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:20 crc kubenswrapper[4753]: W0129 14:04:20.084712 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb55f8c6_430b_46b6_b862_99a7f86d11da.slice/crio-2d74ab8947829aef2a488e83dbc768afd7b39fa26ff9502f46a35841244f2623 WatchSource:0}: Error finding container 2d74ab8947829aef2a488e83dbc768afd7b39fa26ff9502f46a35841244f2623: Status 404 returned error can't find the container with id 2d74ab8947829aef2a488e83dbc768afd7b39fa26ff9502f46a35841244f2623 Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.087824 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.099721 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.101420 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.108923 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.115686 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mxrlb"] Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.139065 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.147784 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.159523 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" podStartSLOduration=75.159505345 podStartE2EDuration="1m15.159505345s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:20.156830541 +0000 UTC m=+94.851564913" watchObservedRunningTime="2026-01-29 14:04:20.159505345 +0000 UTC m=+94.854239727" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.162125 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.181264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.181560 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.681547139 +0000 UTC m=+95.376281521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.183142 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k"
Jan 29 14:04:20 crc kubenswrapper[4753]: W0129 14:04:20.226461 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b31af78_d6bd_49eb_8d02_d22af087771d.slice/crio-2bc4b24f3be9178479de734ee2d3138fa6bb25f4ec5a2dc6e8984bac045ae245 WatchSource:0}: Error finding container 2bc4b24f3be9178479de734ee2d3138fa6bb25f4ec5a2dc6e8984bac045ae245: Status 404 returned error can't find the container with id 2bc4b24f3be9178479de734ee2d3138fa6bb25f4ec5a2dc6e8984bac045ae245
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.235734 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-58ksd"
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.282362 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.283085 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.783068233 +0000 UTC m=+95.477802615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.301517 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-944nm"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.392052 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.392477 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.892463186 +0000 UTC m=+95.587197568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.398357 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.450665 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.491510 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.494234 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.494283 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.994260347 +0000 UTC m=+95.688994729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.495434 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.495541 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.495914 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:20.995901893 +0000 UTC m=+95.690636275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.548451 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.560009 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.579222 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lvbft"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.596859 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.597309 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.097292303 +0000 UTC m=+95.792026685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:20 crc kubenswrapper[4753]: W0129 14:04:20.636620 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6cb6d4b_c2d5_4efd_8269_93827412a96d.slice/crio-ba9e64b5b23a5d9070499951402b863837615ac04dcd6c78a6fe267852ed4454 WatchSource:0}: Error finding container ba9e64b5b23a5d9070499951402b863837615ac04dcd6c78a6fe267852ed4454: Status 404 returned error can't find the container with id ba9e64b5b23a5d9070499951402b863837615ac04dcd6c78a6fe267852ed4454
Jan 29 14:04:20 crc kubenswrapper[4753]: W0129 14:04:20.685836 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode21223f9_863b_45dd_b641_afa73286591f.slice/crio-2a30da7f526990ac0dd9fab62408365c63bcfca71a4ef08e856a0b5ac0a6fad5 WatchSource:0}: Error finding container 2a30da7f526990ac0dd9fab62408365c63bcfca71a4ef08e856a0b5ac0a6fad5: Status 404 returned error can't find the container with id 2a30da7f526990ac0dd9fab62408365c63bcfca71a4ef08e856a0b5ac0a6fad5
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.698029 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.698460 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.198447007 +0000 UTC m=+95.893181389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.751507 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.799012 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.799638 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.299622041 +0000 UTC m=+95.994356423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.865415 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt"]
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.901582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:20 crc kubenswrapper[4753]: E0129 14:04:20.902036 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.40202369 +0000 UTC m=+96.096758082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:20 crc kubenswrapper[4753]: I0129 14:04:20.944752 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9klvl" podStartSLOduration=75.944685636 podStartE2EDuration="1m15.944685636s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:20.926522151 +0000 UTC m=+95.621256533" watchObservedRunningTime="2026-01-29 14:04:20.944685636 +0000 UTC m=+95.639420018"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.003976 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.004343 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.504328635 +0000 UTC m=+96.199063017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.048007 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4v9zm" event={"ID":"1db476d8-783c-4520-9b07-dfd4525a064f","Type":"ContainerStarted","Data":"9792deca88f8bb04816b1d3e797883424d20689e368506d068628d605a8b5326"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.066957 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v"]
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.068874 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" event={"ID":"33156757-7442-4962-ab8e-fe8e8a735fcc","Type":"ContainerStarted","Data":"86bc88ea5dd32e35ee56e583cce4e2045d6d69b1dfe286b4bcd20ac665a839a7"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.079212 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" event={"ID":"ca5f336c-cf5c-4e2a-b7fb-64acbd039052","Type":"ContainerStarted","Data":"f2592a3af82865e44d6c2e76306db8a34f06f0ad531262f3dc8dae1199cf1757"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.105371 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.105687 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.605675764 +0000 UTC m=+96.300410146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.117617 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" event={"ID":"6d8327fe-0e6c-46eb-b79e-6440451d8393","Type":"ContainerStarted","Data":"f2fc5b1f1baac17f8edb0cb08c942684588085bc36e369cb0d33f7b0082ef2a8"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.118739 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" event={"ID":"f6cb6d4b-c2d5-4efd-8269-93827412a96d","Type":"ContainerStarted","Data":"ba9e64b5b23a5d9070499951402b863837615ac04dcd6c78a6fe267852ed4454"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.119630 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" event={"ID":"e82f4e07-1554-4230-867f-427e7469c789","Type":"ContainerStarted","Data":"70b2704fd60c0a4c9032ac773d64df79f0614045565e41101ddda383829a41be"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.130215 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" event={"ID":"776ea39d-e7fb-497f-bfaf-41b385b76754","Type":"ContainerStarted","Data":"6de60352cd886e1035cbd0fe39197dd5b25eaef1cc236789bba25d1864d6e0b1"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.132328 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rls7b" event={"ID":"db55f8c6-430b-46b6-b862-99a7f86d11da","Type":"ContainerStarted","Data":"2d74ab8947829aef2a488e83dbc768afd7b39fa26ff9502f46a35841244f2623"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.139292 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" event={"ID":"6ed95e9c-31ac-4716-8c19-a76da12afe85","Type":"ContainerStarted","Data":"66ced4c9d44cc90b2fc95a254286ccb8cafc949631c4b62f2f09a0ff709d330b"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.148282 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" event={"ID":"3423cc35-1ccb-487e-8d2e-fc72a2f03d9f","Type":"ContainerStarted","Data":"f7f31be7e389d38bd86b8db1052a0bf339c85f4be9aec40b4380cc0c0deae97e"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.155235 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mxrlb" event={"ID":"7b31af78-d6bd-49eb-8d02-d22af087771d","Type":"ContainerStarted","Data":"2bc4b24f3be9178479de734ee2d3138fa6bb25f4ec5a2dc6e8984bac045ae245"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.164001 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" event={"ID":"ed097d8a-81a4-4c18-960e-20041c9df31a","Type":"ContainerStarted","Data":"66e914cee3e6c2462e60c2536343bee8b54e66cb1066c52a2f323eed4d35cb99"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.166452 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.167215 4753 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-h24bz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body=
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.167832 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" podUID="ed097d8a-81a4-4c18-960e-20041c9df31a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.168481 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" event={"ID":"dcbdaa74-121b-4b7c-bd7b-df2aa01a341c","Type":"ContainerStarted","Data":"6d5043846b5f4e08f77132db4818d05c507bb1114cc0b839ac09f076e3f049b8"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.195102 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" event={"ID":"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac","Type":"ContainerStarted","Data":"50b12898a0756ca91cd08456d57c7d34c45c267e786a457241d096c39846bbde"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.199621 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" event={"ID":"e21223f9-863b-45dd-b641-afa73286591f","Type":"ContainerStarted","Data":"2a30da7f526990ac0dd9fab62408365c63bcfca71a4ef08e856a0b5ac0a6fad5"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.203786 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" event={"ID":"905043f7-d346-450b-b46b-4ffcd29313af","Type":"ContainerStarted","Data":"d200d1feae0e7fc762565ff90963c9ac5b54d72a3e6222ffdee4a636c20875a7"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.207530 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.207805 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.707763734 +0000 UTC m=+96.402498126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.208027 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.210481 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.710464499 +0000 UTC m=+96.405198881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.234590 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" event={"ID":"042c5d25-f716-4d9d-a663-28f18b9dfbc1","Type":"ContainerStarted","Data":"32a6139494362cfcdf95432bcb8bebbd634d25ff3c63a22de7d289e5bc6c741d"}
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.235912 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbfcj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.235963 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jbfcj" podUID="f5125f20-92c4-4700-8c7f-f8c7fc7b48b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.315572 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.323120 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.823094432 +0000 UTC m=+96.517828824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.383221 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5qnph"]
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.425425 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.425789 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:21.925777238 +0000 UTC m=+96.620511620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.436491 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.527052 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.527837 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:22.027820497 +0000 UTC m=+96.722554879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.588606 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-58ksd"]
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.614066 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jbfcj" podStartSLOduration=76.614049485 podStartE2EDuration="1m16.614049485s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:21.613111628 +0000 UTC m=+96.307846010" watchObservedRunningTime="2026-01-29 14:04:21.614049485 +0000 UTC m=+96.308783867"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.628841 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.629181 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:22.129167745 +0000 UTC m=+96.823902127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.699372 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" podStartSLOduration=76.699352648 podStartE2EDuration="1m16.699352648s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:21.697752013 +0000 UTC m=+96.392486395" watchObservedRunningTime="2026-01-29 14:04:21.699352648 +0000 UTC m=+96.394087030"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.700222 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" podStartSLOduration=76.700216582 podStartE2EDuration="1m16.700216582s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:21.665452455 +0000 UTC m=+96.360186837" watchObservedRunningTime="2026-01-29 14:04:21.700216582 +0000 UTC m=+96.394950964"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.730201 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.730648 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:22.230631327 +0000 UTC m=+96.925365709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.764787 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-bxx4k" podStartSLOduration=76.764758627 podStartE2EDuration="1m16.764758627s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:21.751687573 +0000 UTC m=+96.446421965" watchObservedRunningTime="2026-01-29 14:04:21.764758627 +0000 UTC m=+96.459493009"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.800652 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" podStartSLOduration=76.800619534 podStartE2EDuration="1m16.800619534s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:21.781845902 +0000 UTC m=+96.476580294" watchObservedRunningTime="2026-01-29 14:04:21.800619534 +0000 UTC m=+96.495353916"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.858703 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-hf9wc" podStartSLOduration=76.8586816 podStartE2EDuration="1m16.8586816s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:21.857394664 +0000 UTC m=+96.552129046" watchObservedRunningTime="2026-01-29 14:04:21.8586816 +0000 UTC m=+96.553415982"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.860327 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.860851 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:22.360838739 +0000 UTC m=+97.055573121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.907677 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jvpqh" podStartSLOduration=76.907649722 podStartE2EDuration="1m16.907649722s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:21.906257883 +0000 UTC m=+96.600992265" watchObservedRunningTime="2026-01-29 14:04:21.907649722 +0000 UTC m=+96.602384104"
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.965854 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:21 crc kubenswrapper[4753]: E0129 14:04:21.966388 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:22.466364995 +0000 UTC m=+97.161099377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:21 crc kubenswrapper[4753]: I0129 14:04:21.974496 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j"]
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.019248 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lnbcl" podStartSLOduration=77.019226645 podStartE2EDuration="1m17.019226645s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:21.996182164 +0000 UTC m=+96.690916556" watchObservedRunningTime="2026-01-29 14:04:22.019226645 +0000 UTC m=+96.713961017"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.068186 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:22 crc kubenswrapper[4753]: E0129 14:04:22.068687 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:22.568667291 +0000 UTC m=+97.263401673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.170462 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:22 crc kubenswrapper[4753]: E0129 14:04:22.170830 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:22.670815592 +0000 UTC m=+97.365549974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.192379 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j"]
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.222006 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc"]
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.253268 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkq94"]
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.260546 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb"]
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.263603 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8dzsr" event={"ID":"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9","Type":"ContainerStarted","Data":"bea0c4285cd3f79fa8e8ad11066f2569d0173ff126ea22f51f8da06104a798a5"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.272404 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:22 crc kubenswrapper[4753]: E0129 14:04:22.272809 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:22.772795249 +0000 UTC m=+97.467529631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.294941 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" event={"ID":"c4427451-802a-4521-9a21-3363d191e650","Type":"ContainerStarted","Data":"8258bcbb1bf9228f2e8814c2bcc395793c9689f0c77ae6199db48dc92dd1145d"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.294979 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" event={"ID":"c4427451-802a-4521-9a21-3363d191e650","Type":"ContainerStarted","Data":"ccc0433e2cf4bf330d4600d67f1dcb3780cfe18ac1d77b3c6d5d0eb7158117fb"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.297749 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" event={"ID":"83519b1e-5a50-4774-a86c-7117668abf6e","Type":"ContainerStarted","Data":"b985670e19911f81dab5b8631bc51397b4cd3d24b5c0b77e490450e804483282"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.306415 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" event={"ID":"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac","Type":"ContainerStarted","Data":"655c7152288c77310f021145476903a8aa96f332592919d0772e834570f9eff7"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.307427 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" event={"ID":"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc","Type":"ContainerStarted","Data":"a06f2e6d1f5cfd85897ae85b81c89f03424f134f7b2d60ca00362789da6470ce"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.310060 4753 generic.go:334] "Generic (PLEG): container finished" podID="ca5f336c-cf5c-4e2a-b7fb-64acbd039052" containerID="74b61b5f35144b14586d8bf83b37cbb898a5dbea461bd7386441079e1646832a" exitCode=0
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.310134 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" event={"ID":"ca5f336c-cf5c-4e2a-b7fb-64acbd039052","Type":"ContainerDied","Data":"74b61b5f35144b14586d8bf83b37cbb898a5dbea461bd7386441079e1646832a"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.313021 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz" event={"ID":"ed097d8a-81a4-4c18-960e-20041c9df31a","Type":"ContainerStarted","Data":"e06d4afbc9579fdc75cd15c876c61f257b8b3c5c80d7b93a37f8b90584faabce"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.315567 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" event={"ID":"f6cb6d4b-c2d5-4efd-8269-93827412a96d","Type":"ContainerStarted","Data":"c951c5553d284484358862c5c015ee0fe0f5836c12a55d80aec70239d6d10f91"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.318268 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" event={"ID":"776ea39d-e7fb-497f-bfaf-41b385b76754","Type":"ContainerStarted","Data":"f3f3147f75b0472631fc699585d5535c19e3da50acddeadfece6a2e4ed7892b7"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.318488 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.320833 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" event={"ID":"33156757-7442-4962-ab8e-fe8e8a735fcc","Type":"ContainerStarted","Data":"860aa7eb80c33438c84e45ff4b5206a3a16fd5060d8379646ffbb431a70a1823"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.328041 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" event={"ID":"ff4ba356-bf66-4e46-83f9-224c75fbbc85","Type":"ContainerStarted","Data":"74fc761f96db3a2576735e462d15a96258a0d6613a63fba8ac672379079a3df6"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.330531 4753 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lhpqt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.330597 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" podUID="776ea39d-e7fb-497f-bfaf-41b385b76754" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.333269 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h24bz"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.340592 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" event={"ID":"905043f7-d346-450b-b46b-4ffcd29313af","Type":"ContainerStarted","Data":"2527d218db9494c300a11d63dfd572575d95cc8a43e3f7009abdc6a9f51cb4ce"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.345351 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v" event={"ID":"a98eae96-08b8-4589-ab80-07e0a551d398","Type":"ContainerStarted","Data":"9b8148bf93899ec5ba1ea7df82fd9fa86b3bf08bc8d3f2dc46a37c7f91a9b0b9"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.350304 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qgfrx"]
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.353937 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" event={"ID":"e82f4e07-1554-4230-867f-427e7469c789","Type":"ContainerStarted","Data":"2835149e2d7f262853cb1de99baeab17c194fe1b6e81c826c1c9ad324290165d"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.358035 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mtr5k"]
Jan 29 14:04:22 crc kubenswrapper[4753]: W0129 14:04:22.373180 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5beed408_607e_42c8_be14_5b942a65f510.slice/crio-63ba4144383f8c331bdc041764d97a246a92dbee9137c5f75a6b741fb74402d8 WatchSource:0}: Error finding container 63ba4144383f8c331bdc041764d97a246a92dbee9137c5f75a6b741fb74402d8: Status 404 returned error can't find the container with id 63ba4144383f8c331bdc041764d97a246a92dbee9137c5f75a6b741fb74402d8
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.373853 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:22 crc kubenswrapper[4753]: E0129 14:04:22.374225 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:22.87420904 +0000 UTC m=+97.568943422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.374369 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5qnph" event={"ID":"de6aa533-9dd6-4579-aee2-38c2aebc7e31","Type":"ContainerStarted","Data":"9ccb29694d920d01641b6772ef89c4a3ba353b8be0585d96ae54408b1ffd0e7c"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.375263 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" podStartSLOduration=77.375247379 podStartE2EDuration="1m17.375247379s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:22.373767698 +0000 UTC m=+97.068502080" watchObservedRunningTime="2026-01-29 14:04:22.375247379 +0000 UTC m=+97.069981761"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.379585 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4v9zm" event={"ID":"1db476d8-783c-4520-9b07-dfd4525a064f","Type":"ContainerStarted","Data":"cb4f15ad42c5a441eeab1807e79f388cf0f8e5f3b0dd5f1f641d3afc29460f01"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.390198 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rls7b" event={"ID":"db55f8c6-430b-46b6-b862-99a7f86d11da","Type":"ContainerStarted","Data":"7e60db2d180119e06bc3fbc24305b647f75a5bbe6fd7e6512b367475ab61b0e0"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.391272 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-rls7b"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.394396 4753 patch_prober.go:28] interesting pod/console-operator-58897d9998-rls7b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.394458 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rls7b" podUID="db55f8c6-430b-46b6-b862-99a7f86d11da" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.397370 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp"]
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.414459 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj"]
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.415740 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fwm9n" podStartSLOduration=77.415727855 podStartE2EDuration="1m17.415727855s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:22.415664873 +0000 UTC m=+97.110399255" watchObservedRunningTime="2026-01-29 14:04:22.415727855 +0000 UTC m=+97.110462247"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.436492 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" event={"ID":"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4","Type":"ContainerStarted","Data":"eebce87f188c52b86664f508ea873077841352f975ddcf861362a64c3aca3276"}
Jan 29 14:04:22 crc kubenswrapper[4753]: W0129 14:04:22.450758 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647fe5d7_4243_4608_8351_6bc2e13b9f15.slice/crio-ccf52d40fabd14bc8ac01133cccccd51fd4b6be0eaada6107380a2fdcaa2dc7f WatchSource:0}: Error finding container ccf52d40fabd14bc8ac01133cccccd51fd4b6be0eaada6107380a2fdcaa2dc7f: Status 404 returned error can't find the container with id ccf52d40fabd14bc8ac01133cccccd51fd4b6be0eaada6107380a2fdcaa2dc7f
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.457063 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-58ksd" event={"ID":"f5f34f58-d006-4253-9328-eacdd6728e68","Type":"ContainerStarted","Data":"232e7b384798eda08cc29ec2c15942c5e9721c71d9e1d6817ca1069d0c4053b7"}
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.464269 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbfcj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.464654 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jbfcj" podUID="f5125f20-92c4-4700-8c7f-f8c7fc7b48b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.485446 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:22 crc kubenswrapper[4753]: E0129 14:04:22.510637 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.010618214 +0000 UTC m=+97.705352596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.538142 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9dl5" podStartSLOduration=77.538116789 podStartE2EDuration="1m17.538116789s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:22.478791429 +0000 UTC m=+97.173525811" watchObservedRunningTime="2026-01-29 14:04:22.538116789 +0000 UTC m=+97.232851171"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.554965 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-rls7b" podStartSLOduration=77.554933047 podStartE2EDuration="1m17.554933047s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:22.534741855 +0000 UTC m=+97.229476237" watchObservedRunningTime="2026-01-29 14:04:22.554933047 +0000 UTC m=+97.249667429"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.595101 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-4v9zm" podStartSLOduration=77.595081024 podStartE2EDuration="1m17.595081024s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:22.594590309 +0000 UTC m=+97.289324691" watchObservedRunningTime="2026-01-29 14:04:22.595081024 +0000 UTC m=+97.289815406"
Jan 29 14:04:22 crc kubenswrapper[4753]: W0129 14:04:22.596340 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod266c621d_99dd_42a9_83c3_ac5288325710.slice/crio-891e1a326b76b7346fa6138b8bcfa32f01fd887e5191f4b3f401916609b0f0fc WatchSource:0}: Error finding container 891e1a326b76b7346fa6138b8bcfa32f01fd887e5191f4b3f401916609b0f0fc: Status 404 returned error can't find the container with id 891e1a326b76b7346fa6138b8bcfa32f01fd887e5191f4b3f401916609b0f0fc
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.615804 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9gtxn" podStartSLOduration=77.61578241 podStartE2EDuration="1m17.61578241s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:22.614993087 +0000 UTC m=+97.309727469" watchObservedRunningTime="2026-01-29 14:04:22.61578241 +0000 UTC m=+97.310516782"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.663097 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:22 crc kubenswrapper[4753]: E0129 14:04:22.664215 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.164183875 +0000 UTC m=+97.858918257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.766102 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.766459 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-4v9zm"
Jan 29 14:04:22 crc kubenswrapper[4753]: E0129 14:04:22.766574 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.266560653 +0000 UTC m=+97.961295035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.772362 4753 patch_prober.go:28] interesting pod/router-default-5444994796-4v9zm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 29 14:04:22 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld
Jan 29 14:04:22 crc kubenswrapper[4753]: [+]process-running ok
Jan 29 14:04:22 crc kubenswrapper[4753]: healthz check failed
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.772415 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v9zm" podUID="1db476d8-783c-4520-9b07-dfd4525a064f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.867400 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 29 14:04:22 crc kubenswrapper[4753]: E0129 14:04:22.868019 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.368002795 +0000 UTC m=+98.062737177 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 29 14:04:22 crc kubenswrapper[4753]: I0129 14:04:22.971920 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798"
Jan 29 14:04:22 crc kubenswrapper[4753]: E0129 14:04:22.972429 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.472411739 +0000 UTC m=+98.167146121 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.035782 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.036404 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.048986 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.084020 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.084535 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.584518207 +0000 UTC m=+98.279252589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.186437 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.187606 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.687591305 +0000 UTC m=+98.382325687 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.287629 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.287796 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.787763742 +0000 UTC m=+98.482498114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.288251 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.288614 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.788603734 +0000 UTC m=+98.483338106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.389651 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.390884 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.890866959 +0000 UTC m=+98.585601341 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.464749 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" event={"ID":"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc","Type":"ContainerStarted","Data":"bb4639984d10a4462b4c4c9a08f26afd90becd47bb5edeceecee6aa1d1c04d45"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.464816 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" event={"ID":"0ab6b6ce-46b9-4ffb-a95f-f46fc72d8edc","Type":"ContainerStarted","Data":"b7cc0cd7c3c571ea4ccc19dea31ee95467e24b6395200df3a470b860e66c2cbe"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.466560 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" event={"ID":"647fe5d7-4243-4608-8351-6bc2e13b9f15","Type":"ContainerStarted","Data":"f7344526d39deb3662a79603e6c72db1784dcdad1b3d65cb4b36220c87cb7682"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.466592 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" event={"ID":"647fe5d7-4243-4608-8351-6bc2e13b9f15","Type":"ContainerStarted","Data":"ccf52d40fabd14bc8ac01133cccccd51fd4b6be0eaada6107380a2fdcaa2dc7f"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.468759 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" event={"ID":"6ed95e9c-31ac-4716-8c19-a76da12afe85","Type":"ContainerStarted","Data":"db329b74c7c2836b659248df91d9e5e58cad32b4017671e2ff0546b1e031e8a8"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.468787 4753 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" event={"ID":"6ed95e9c-31ac-4716-8c19-a76da12afe85","Type":"ContainerStarted","Data":"c375e6548a887cfe2a226eab73911727a967069c32e10a63ea4b0437fa1452e2"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.468982 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.470577 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" event={"ID":"83519b1e-5a50-4774-a86c-7117668abf6e","Type":"ContainerStarted","Data":"d297ac76f3a0ed9f0006e23988bce7414760a0b84dcb6134df28fb0bb58d3fee"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.476538 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" event={"ID":"6230cf30-7e1a-40ea-babe-72eb39ac2be7","Type":"ContainerStarted","Data":"5f2bd85d19cd9a3145a87e2da7bb01b649434e0afd89912970fb81c6bb0a36fe"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.476601 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" event={"ID":"6230cf30-7e1a-40ea-babe-72eb39ac2be7","Type":"ContainerStarted","Data":"d1ad70d05647d48dcb5d6c0eabe738f1b20e3c668e0988b34e2efbf2cc47d078"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.478321 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" event={"ID":"5beed408-607e-42c8-be14-5b942a65f510","Type":"ContainerStarted","Data":"5c40905370622a19cd09aeb3a146d418c7db636d48f8aca15ccae1df11533a2f"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.478365 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" event={"ID":"5beed408-607e-42c8-be14-5b942a65f510","Type":"ContainerStarted","Data":"63ba4144383f8c331bdc041764d97a246a92dbee9137c5f75a6b741fb74402d8"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.481475 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" event={"ID":"ca5f336c-cf5c-4e2a-b7fb-64acbd039052","Type":"ContainerStarted","Data":"ad2a90c883f42fed5d08476b3ec8dd9ee0d252d6177991ef284015e738ded55c"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.481954 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.485234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" event={"ID":"85dd3e88-2190-4d98-94d6-26b9fee9d20f","Type":"ContainerStarted","Data":"f251f969f7ee43e76a5db0a4a755662b0fb371523c1a0b984946e326163569db"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.485263 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" event={"ID":"85dd3e88-2190-4d98-94d6-26b9fee9d20f","Type":"ContainerStarted","Data":"f6890a8a7c93cb465a75b65975c5a96e7e20e271b46001fd9a523eb213848742"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.486650 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-8dzsr" event={"ID":"6e8f94ac-ad91-40e6-868d-e5e8de3de9b9","Type":"ContainerStarted","Data":"0d65f7bd269e6838850068194fc620197d00e9bd0046d6deab5791f91edd81e4"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.487927 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" event={"ID":"5a293056-ca09-4e84-86a5-11785aaa9a62","Type":"ContainerStarted","Data":"d2dae518812484b91c75879b89c1a363afeacd83c578e184ab2cce7541bc3a57"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.487956 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" event={"ID":"5a293056-ca09-4e84-86a5-11785aaa9a62","Type":"ContainerStarted","Data":"46a87f79eeb6e4500da62230b78b2dc5f624f47bd16e37600d04fd0bce61fe93"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.490317 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" event={"ID":"6d8327fe-0e6c-46eb-b79e-6440451d8393","Type":"ContainerStarted","Data":"5a74e91df70517baec75af41cf84849aac1fbcaa3b6dedb4783115179968ab6f"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.490345 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" event={"ID":"6d8327fe-0e6c-46eb-b79e-6440451d8393","Type":"ContainerStarted","Data":"0590732640a96d66e61adb05260ae8f87310fcef8204795180d5ebfaaa0fdba0"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.491445 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.492859 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:23.992839945 +0000 UTC m=+98.687574327 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.494365 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" event={"ID":"042c5d25-f716-4d9d-a663-28f18b9dfbc1","Type":"ContainerStarted","Data":"2a08dbaa0c524cb7e45e96ce50aadf05d8b21caf7a7fb1a5d3f536a809d60a31"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.496835 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5qnph" event={"ID":"de6aa533-9dd6-4579-aee2-38c2aebc7e31","Type":"ContainerStarted","Data":"d429f454bd9af4019e68f08c8c5cb61dc37ae21fcfe5323ef4d1cd5e40f36e1e"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.499287 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" event={"ID":"ff4ba356-bf66-4e46-83f9-224c75fbbc85","Type":"ContainerStarted","Data":"22aac6834b8e29bd43e9c84b72ca33acb0dcff26174339aba1f2360be0c13d9d"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.499376 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.502449 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" event={"ID":"2be7ad19-75d6-4c8e-bf95-5e62182fd5ac","Type":"ContainerStarted","Data":"524e7ff98b9a620d04db9595169d09d2058f69c3b3f666f688696827bb996076"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.502755 4753 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-l9ddt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.502835 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" podUID="ff4ba356-bf66-4e46-83f9-224c75fbbc85" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.508663 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gfb7j" podStartSLOduration=78.508636964 podStartE2EDuration="1m18.508636964s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.508004507 +0000 UTC m=+98.202738889" watchObservedRunningTime="2026-01-29 14:04:23.508636964 +0000 UTC m=+98.203371346" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.508880 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mxrlb" 
event={"ID":"7b31af78-d6bd-49eb-8d02-d22af087771d","Type":"ContainerStarted","Data":"556d57f69e8d0b77d90395aed9aeaf20f36b04b1f780e3ac49f86ad547a72171"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.508973 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mxrlb" event={"ID":"7b31af78-d6bd-49eb-8d02-d22af087771d","Type":"ContainerStarted","Data":"babf35ba817e6c2ef200b609d54b28e32cd556e3d556436f92bc2ee69f8c7783"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.509086 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.512094 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" event={"ID":"dcbdaa74-121b-4b7c-bd7b-df2aa01a341c","Type":"ContainerStarted","Data":"cdf0c1511a0e11d4e0a4782a45bff83f74782c2fa1810cdea48259d0c21f458b"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.512140 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" event={"ID":"dcbdaa74-121b-4b7c-bd7b-df2aa01a341c","Type":"ContainerStarted","Data":"f60a41e91fb582b5c3f44de090828f9a5a48663f443f627971e572764ec01a05"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.515060 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" event={"ID":"e21223f9-863b-45dd-b641-afa73286591f","Type":"ContainerStarted","Data":"5fd045c1755f73ae68e5bb0d0a3eac2e6a678e2998ae3b0356d05a2fe56dfe70"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.517315 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" event={"ID":"19f60d75-bda2-4816-b146-f5e29203ffbc","Type":"ContainerStarted","Data":"e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.517338 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" event={"ID":"19f60d75-bda2-4816-b146-f5e29203ffbc","Type":"ContainerStarted","Data":"ce986b1a47f994b5d1b8ce1640b0b95878642b06ebf70c4321f2dd4c2b0d61cc"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.517352 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.520096 4753 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vkq94 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.520199 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" podUID="19f60d75-bda2-4816-b146-f5e29203ffbc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.520652 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" 
event={"ID":"8a7be31e-2bc5-4b5c-8d26-a220f35b87d4","Type":"ContainerStarted","Data":"085b3f8a1fe88d5bececa1bf68c117684aef9f2ee51c4d366dea0fd4b96e967a"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.523603 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v" event={"ID":"a98eae96-08b8-4589-ab80-07e0a551d398","Type":"ContainerStarted","Data":"407fe475d900e80b3e2a3c44fe076cca2a996ea81d1a13b302d72aa00476ea42"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.523636 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v" event={"ID":"a98eae96-08b8-4589-ab80-07e0a551d398","Type":"ContainerStarted","Data":"1959c30bcb313994e2295d9d8218d58fb5a6b6d96d829316fdf414a13d20a494"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.532914 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" event={"ID":"266c621d-99dd-42a9-83c3-ac5288325710","Type":"ContainerStarted","Data":"cf99e19ab5795fb89902666089761c6b448d967baeb5c5d6489d3f96743ea53b"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.533045 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" event={"ID":"266c621d-99dd-42a9-83c3-ac5288325710","Type":"ContainerStarted","Data":"e4c4bd0147fc0b972bb4dfaa133ec3fe0cabf36433aa85493bb3ef55bee90fdc"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.533071 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" event={"ID":"266c621d-99dd-42a9-83c3-ac5288325710","Type":"ContainerStarted","Data":"891e1a326b76b7346fa6138b8bcfa32f01fd887e5191f4b3f401916609b0f0fc"} Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.534002 4753 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lhpqt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.534056 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" podUID="776ea39d-e7fb-497f-bfaf-41b385b76754" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.543559 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-mtr5k" podStartSLOduration=78.543546656 podStartE2EDuration="1m18.543546656s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.540911252 +0000 UTC m=+98.235645634" watchObservedRunningTime="2026-01-29 14:04:23.543546656 +0000 UTC m=+98.238281028" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.551345 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bp589" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.569344 4753 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/machine-config-server-8dzsr" podStartSLOduration=7.569315123 podStartE2EDuration="7.569315123s" podCreationTimestamp="2026-01-29 14:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.56812578 +0000 UTC m=+98.262860162" watchObservedRunningTime="2026-01-29 14:04:23.569315123 +0000 UTC m=+98.264049505" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.593211 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" podStartSLOduration=78.593193537 podStartE2EDuration="1m18.593193537s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.590523153 +0000 UTC m=+98.285257545" watchObservedRunningTime="2026-01-29 14:04:23.593193537 +0000 UTC m=+98.287927919" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.594943 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.595034 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.095019488 +0000 UTC m=+98.789753870 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.608551 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.630894 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d76gc" podStartSLOduration=78.630867575 podStartE2EDuration="1m18.630867575s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.620551678 +0000 UTC m=+98.315286060" watchObservedRunningTime="2026-01-29 14:04:23.630867575 +0000 UTC m=+98.325601957" Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.649227 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.149183375 +0000 UTC m=+98.843917757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.671947 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lb8gb" podStartSLOduration=78.671925857 podStartE2EDuration="1m18.671925857s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.660993463 +0000 UTC m=+98.355727845" watchObservedRunningTime="2026-01-29 14:04:23.671925857 +0000 UTC m=+98.366660239" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.714824 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.717722 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" podStartSLOduration=78.717701491 podStartE2EDuration="1m18.717701491s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.717032451 +0000 UTC m=+98.411766833" watchObservedRunningTime="2026-01-29 14:04:23.717701491 +0000 UTC m=+98.412435873" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.719214 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6cp9x" podStartSLOduration=78.719209502 podStartE2EDuration="1m18.719209502s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.690300498 +0000 UTC m=+98.385034880" watchObservedRunningTime="2026-01-29 14:04:23.719209502 +0000 UTC m=+98.413943884" Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.719235 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.219207532 +0000 UTC m=+98.913941914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.760612 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" podStartSLOduration=78.760587903 podStartE2EDuration="1m18.760587903s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.741535673 +0000 UTC m=+98.436270055" watchObservedRunningTime="2026-01-29 14:04:23.760587903 +0000 UTC m=+98.455322275" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.762659 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" podStartSLOduration=78.762651611 podStartE2EDuration="1m18.762651611s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.7594211 +0000 UTC m=+98.454155482" watchObservedRunningTime="2026-01-29 14:04:23.762651611 +0000 UTC m=+98.457385993" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.770387 4753 patch_prober.go:28] interesting pod/router-default-5444994796-4v9zm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 14:04:23 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Jan 29 14:04:23 crc kubenswrapper[4753]: [+]process-running ok Jan 29 14:04:23 crc kubenswrapper[4753]: healthz check failed Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.770492 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v9zm" podUID="1db476d8-783c-4520-9b07-dfd4525a064f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.812861 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fv6pm" podStartSLOduration=78.812838607 podStartE2EDuration="1m18.812838607s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.784649832 +0000 UTC m=+98.479384214" watchObservedRunningTime="2026-01-29 14:04:23.812838607 +0000 UTC m=+98.507572999" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.814826 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qgfrx" podStartSLOduration=78.814818482 podStartE2EDuration="1m18.814818482s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.810891392 
+0000 UTC m=+98.505625774" watchObservedRunningTime="2026-01-29 14:04:23.814818482 +0000 UTC m=+98.509552874" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.816134 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.816475 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.316459967 +0000 UTC m=+99.011194349 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.827961 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5qnph" podStartSLOduration=7.827942686 podStartE2EDuration="7.827942686s" podCreationTimestamp="2026-01-29 14:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.824594863 +0000 UTC m=+98.519329245" watchObservedRunningTime="2026-01-29 14:04:23.827942686 +0000 UTC m=+98.522677068" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.844300 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lvbft" podStartSLOduration=78.844279881 podStartE2EDuration="1m18.844279881s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.843999783 +0000 UTC m=+98.538734165" watchObservedRunningTime="2026-01-29 14:04:23.844279881 +0000 UTC m=+98.539014263" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.911563 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8fv88" podStartSLOduration=78.911544392 podStartE2EDuration="1m18.911544392s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.909800544 +0000 UTC m=+98.604534926" watchObservedRunningTime="2026-01-29 14:04:23.911544392 +0000 UTC m=+98.606278774" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.911657 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-txd8j" podStartSLOduration=78.911652985 podStartE2EDuration="1m18.911652985s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.886591588 +0000 UTC m=+98.581325970" watchObservedRunningTime="2026-01-29 14:04:23.911652985 +0000 UTC m=+98.606387367" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.920900 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:23 crc kubenswrapper[4753]: E0129 14:04:23.921249 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.421231462 +0000 UTC m=+99.115965844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.972694 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" podStartSLOduration=78.972650502 podStartE2EDuration="1m18.972650502s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.944367325 +0000 UTC m=+98.639101707" watchObservedRunningTime="2026-01-29 14:04:23.972650502 +0000 UTC m=+98.667384884" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.997764 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-944nm" podStartSLOduration=78.99774245 podStartE2EDuration="1m18.99774245s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.974066001 +0000 UTC m=+98.668800383" watchObservedRunningTime="2026-01-29 14:04:23.99774245 +0000 UTC m=+98.692476832" Jan 29 14:04:23 crc kubenswrapper[4753]: I0129 14:04:23.998693 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" podStartSLOduration=78.998655045 podStartE2EDuration="1m18.998655045s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:23.99739319 +0000 UTC m=+98.692127572" watchObservedRunningTime="2026-01-29 14:04:23.998655045 +0000 UTC m=+98.693389427" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.022117 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: 
\"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.022179 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.022479 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.522467667 +0000 UTC m=+99.217202049 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.035518 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/281741ae-7781-4682-8b1d-207c9a437581-metrics-certs\") pod \"network-metrics-daemon-57lv7\" (UID: \"281741ae-7781-4682-8b1d-207c9a437581\") " pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.061669 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mxrlb" podStartSLOduration=8.061650857 podStartE2EDuration="8.061650857s" podCreationTimestamp="2026-01-29 14:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:24.060429563 +0000 UTC m=+98.755163945" watchObservedRunningTime="2026-01-29 14:04:24.061650857 +0000 UTC m=+98.756385239" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.124206 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.125517 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.625487833 +0000 UTC m=+99.320222205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.125669 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.126374 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.626201883 +0000 UTC m=+99.320936255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.148187 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lmpvk" podStartSLOduration=79.148171724 podStartE2EDuration="1m19.148171724s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:24.135216514 +0000 UTC m=+98.829950926" watchObservedRunningTime="2026-01-29 14:04:24.148171724 +0000 UTC m=+98.842906106" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.179073 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gqggj" podStartSLOduration=79.179052773 podStartE2EDuration="1m19.179052773s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:24.173974181 +0000 UTC m=+98.868708563" watchObservedRunningTime="2026-01-29 14:04:24.179052773 +0000 UTC m=+98.873787155" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.208913 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-57lv7" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.226454 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.226967 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.726951295 +0000 UTC m=+99.421685677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.266019 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-99c7v" podStartSLOduration=79.265982501 podStartE2EDuration="1m19.265982501s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:24.263411719 +0000 UTC m=+98.958146101" watchObservedRunningTime="2026-01-29 14:04:24.265982501 +0000 UTC m=+98.960716883" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.328280 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.328734 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.828709426 +0000 UTC m=+99.523443798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.429832 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.430030 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.929999733 +0000 UTC m=+99.624734115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.430103 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.430486 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:24.930472227 +0000 UTC m=+99.625206609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.531937 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.532266 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.032227357 +0000 UTC m=+99.726961739 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.532768 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.533267 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.033248825 +0000 UTC m=+99.727983207 (durationBeforeRetry 500ms). 
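[Annotation] The repeating MountVolume.MountDevice and UnmountVolume.TearDown failures are two sides of the same race: the attacher (mounting the PVC for the new image-registry-697d97f7c8-rf798 pod) and the unmounter (tearing down the same volume for the terminated pod 8f668bae-612b-4b75-9490-919e737c6a3b) both need a CSI client for kubevirt.io.hostpath-provisioner, and that driver has not yet registered with this kubelet. Each failure re-arms a 500ms durationBeforeRetry in nestedpendingoperations, and the reconciler's periodic sweep keeps surfacing the same pair of errors until the driver appears. A simplified stand-in for that fixed-backoff gate (types and names are mine, not kubelet's):

package main

import (
	"errors"
	"fmt"
	"time"
)

// retryGate refuses to re-run a failed operation until a fixed backoff
// (500ms here, matching durationBeforeRetry in the log) has elapsed.
type retryGate struct {
	notBefore time.Time
	backoff   time.Duration
}

func (g *retryGate) run(op func() error) error {
	if time.Now().Before(g.notBefore) {
		return errors.New("no retries permitted yet")
	}
	if err := op(); err != nil {
		g.notBefore = time.Now().Add(g.backoff)
		return err
	}
	return nil
}

func main() {
	registered := map[string]bool{} // CSI drivers seen so far
	g := &retryGate{backoff: 500 * time.Millisecond}

	mountDevice := func() error {
		if !registered["kubevirt.io.hostpath-provisioner"] {
			return errors.New("driver not found in the list of registered CSI drivers")
		}
		return nil
	}

	for i := 0; i < 3; i++ {
		fmt.Println(g.run(mountDevice)) // fails, or is refused inside the backoff window
		time.Sleep(300 * time.Millisecond)
	}
	registered["kubevirt.io.hostpath-provisioner"] = true // the plugin_watcher event, further below
	time.Sleep(500 * time.Millisecond)
	fmt.Println(g.run(mountDevice)) // <nil>: the mount now proceeds
}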
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.533262 4753 patch_prober.go:28] interesting pod/console-operator-58897d9998-rls7b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.533336 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rls7b" podUID="db55f8c6-430b-46b6-b862-99a7f86d11da" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.541982 4753 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-l9ddt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.542042 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" podUID="ff4ba356-bf66-4e46-83f9-224c75fbbc85" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.542470 4753 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vkq94 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.542524 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" podUID="19f60d75-bda2-4816-b146-f5e29203ffbc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.543606 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-58ksd" event={"ID":"f5f34f58-d006-4253-9328-eacdd6728e68","Type":"ContainerStarted","Data":"98c5f197d57ed522dcf89a93a3c5658a3e20a90260ef3a868390e04dc9c13685"} Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.633869 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.634358 4753 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.134309796 +0000 UTC m=+99.829044178 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.715532 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-rls7b" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.736191 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.739332 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.239317297 +0000 UTC m=+99.934051679 (durationBeforeRetry 500ms). 
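[Annotation] The probe entries in this stretch show the kubelet's prober at work: console-operator fails readiness with a client-side timeout (context deadline exceeded), while packageserver and marketplace-operator fail with connection refused because nothing is listening on their ports yet. A failed readiness probe only takes the pod out of service endpoints; it does not restart the container, and console-operator indeed flips back via "SyncLoop (probe)" readiness status="ready" just above. For orientation, a minimal sketch of the probed side, a /readyz-style handler of the kind these probes GET (illustrative; not the operators' actual code, and plain HTTP where the real endpoints use TLS):

package main

import (
	"net/http"
	"sync/atomic"
)

func main() {
	var ready atomic.Bool // flipped on once the process has synced its caches

	http.HandleFunc("/readyz", func(w http.ResponseWriter, r *http.Request) {
		if !ready.Load() {
			// The kubelet records this as probeResult="failure" and keeps
			// the pod out of endpoints; only liveness failures restart it.
			http.Error(w, "not ready", http.StatusServiceUnavailable)
			return
		}
		w.Write([]byte("ok"))
	})

	ready.Store(true)
	// Until this server is listening, probes fail with "connection refused",
	// exactly like the packageserver and marketplace-operator entries above.
	http.ListenAndServe(":8080", nil)
}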
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.758377 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" podStartSLOduration=79.758357017 podStartE2EDuration="1m19.758357017s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:24.357919449 +0000 UTC m=+99.052653831" watchObservedRunningTime="2026-01-29 14:04:24.758357017 +0000 UTC m=+99.453091399" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.783785 4753 csr.go:261] certificate signing request csr-lvlzh is approved, waiting to be issued Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.784206 4753 csr.go:257] certificate signing request csr-lvlzh is issued Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.800997 4753 patch_prober.go:28] interesting pod/router-default-5444994796-4v9zm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 14:04:24 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Jan 29 14:04:24 crc kubenswrapper[4753]: [+]process-running ok Jan 29 14:04:24 crc kubenswrapper[4753]: healthz check failed Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.801075 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v9zm" podUID="1db476d8-783c-4520-9b07-dfd4525a064f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.838139 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.838568 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.338551707 +0000 UTC m=+100.033286089 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.861837 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-57lv7"] Jan 29 14:04:24 crc kubenswrapper[4753]: I0129 14:04:24.939613 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:24 crc kubenswrapper[4753]: E0129 14:04:24.939957 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.439946038 +0000 UTC m=+100.134680420 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.041394 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.041784 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.54176629 +0000 UTC m=+100.236500672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.142546 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.142940 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.642923493 +0000 UTC m=+100.337657875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.243960 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.244189 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.74416473 +0000 UTC m=+100.438899112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.244650 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.244987 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.744978752 +0000 UTC m=+100.439713134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.345646 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.345830 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.845797157 +0000 UTC m=+100.540531539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.345869 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.346227 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.846217559 +0000 UTC m=+100.540951941 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.391038 4753 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-t9bsg container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.391094 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" podUID="ca5f336c-cf5c-4e2a-b7fb-64acbd039052" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.391379 4753 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-t9bsg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.391395 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" podUID="ca5f336c-cf5c-4e2a-b7fb-64acbd039052" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.447727 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.448065 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:25.948050551 +0000 UTC m=+100.642784933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.553361 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.553680 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.053666859 +0000 UTC m=+100.748401241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.568945 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-57lv7" event={"ID":"281741ae-7781-4682-8b1d-207c9a437581","Type":"ContainerStarted","Data":"d713a038d6b7639b6067d26b11ef8db6f35bec909452e84a0561528dca03cdc3"} Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.569012 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-57lv7" event={"ID":"281741ae-7781-4682-8b1d-207c9a437581","Type":"ContainerStarted","Data":"c39123bfc6c38bee439cfcf0f057a7779a7c45c84c671f6f3cf74523a603d8d8"} Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.654619 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.655271 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.155231844 +0000 UTC m=+100.849966226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.755700 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.756487 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.256415649 +0000 UTC m=+100.951150021 (durationBeforeRetry 500ms). 
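[Annotation] The "SyncLoop (PLEG): event for pod" entries are the Pod Lifecycle Event Generator relaying container-runtime state changes back into the kubelet's sync loop: each ContainerStarted carries the pod UID and the started container (or sandbox) ID, and the sync loop reacts by re-syncing that pod. A stripped-down sketch of that producer/consumer shape (types and names are mine, for illustration only; IDs truncated from the entries above):

package main

import "fmt"

type PodLifecycleEvent struct {
	PodID string // pod UID, e.g. "281741ae-7781-4682-8b1d-207c9a437581"
	Type  string // e.g. "ContainerStarted"
	Data  string // container/sandbox ID reported by the runtime
}

func main() {
	plegCh := make(chan PodLifecycleEvent, 16)

	// Producer: in the kubelet this is PLEG relisting the runtime and
	// diffing old vs. new container state; here we fake two events.
	go func() {
		plegCh <- PodLifecycleEvent{"281741ae-…", "ContainerStarted", "d713a038…"}
		plegCh <- PodLifecycleEvent{"281741ae-…", "ContainerStarted", "c39123bf…"}
		close(plegCh)
	}()

	// Consumer: the sync loop logs the event and triggers a pod sync.
	for ev := range plegCh {
		fmt.Printf("SyncLoop (PLEG): event for pod %s: %s %s\n", ev.PodID, ev.Type, ev.Data)
	}
}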
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.770444 4753 patch_prober.go:28] interesting pod/router-default-5444994796-4v9zm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 14:04:25 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Jan 29 14:04:25 crc kubenswrapper[4753]: [+]process-running ok Jan 29 14:04:25 crc kubenswrapper[4753]: healthz check failed Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.770512 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v9zm" podUID="1db476d8-783c-4520-9b07-dfd4525a064f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.785095 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-29 13:59:24 +0000 UTC, rotation deadline is 2026-11-08 17:34:10.133635065 +0000 UTC Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.785132 4753 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6795h29m44.348505558s for next certificate rotation Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.857833 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.858134 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.358118717 +0000 UTC m=+101.052853099 (durationBeforeRetry 500ms). 
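[Annotation] The certificate_manager lines above record the kubelet-serving certificate rotation schedule: the certificate expires 2027-01-29 13:59:24 (a one-year lifetime, consistent with the csr-lvlzh approval logged moments earlier), and the rotation deadline of 2026-11-08 17:34:10 sits at roughly 78% of that lifetime, hence "Waiting 6795h29m44s" (about 283 days). That placement is consistent with client-go's certificate manager choosing a jittered deadline in the 70-90% band of the cert's validity; the jitter formula below is my reading of that behavior, not something stated in this log:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline mimics the jittered deadline: notBefore plus a
// random 70-90% slice of the certificate's total validity.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	notBefore, _ := time.Parse(time.RFC3339, "2026-01-29T13:59:24Z")
	notAfter, _ := time.Parse(time.RFC3339, "2027-01-29T13:59:24Z")

	deadline := rotationDeadline(notBefore, notAfter)
	fmt.Println("rotation deadline:", deadline)
	fmt.Println("waiting:", deadline.Sub(notBefore)) // 6132h-7884h band; this log drew ~6795h
}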
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:25 crc kubenswrapper[4753]: I0129 14:04:25.959474 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:25 crc kubenswrapper[4753]: E0129 14:04:25.959792 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.459779135 +0000 UTC m=+101.154513517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.060921 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.061114 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.561084653 +0000 UTC m=+101.255819035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.061319 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.061685 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.56167608 +0000 UTC m=+101.256410462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.166555 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.166859 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.666797994 +0000 UTC m=+101.361532386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.167199 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.167814 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.667801291 +0000 UTC m=+101.362535673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.249463 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-t9bsg" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.270182 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.270781 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.770741905 +0000 UTC m=+101.465476287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.282235 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7v9s"] Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.283564 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.291428 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7v9s"] Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.304812 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.372551 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.372946 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.872933217 +0000 UTC m=+101.567667589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.406374 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gz9wv"] Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.407614 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.425669 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.441225 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gz9wv"] Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.474956 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.475353 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.975322955 +0000 UTC m=+101.670057337 (durationBeforeRetry 500ms). 
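[Annotation] "No sandbox for pod can be found. Need to start a new one" marks the first sync of a freshly scheduled pod (here the marketplace catalog pods just ADDed by SyncLoop): before any container can start, the kubelet asks the container runtime over CRI to create the pod sandbox, which holds the pod's network namespace and cgroup parent. A hedged sketch of that call against the CRI API (the socket path is a typical CRI-O location, not quoted from this log; metadata values are taken from the certified-operators-p7v9s entries; error handling trimmed):

package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// CRI-O's socket on an OpenShift node; adjust for your runtime.
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)

	// RunPodSandbox is what follows the kubelet's
	// "No sandbox for pod can be found. Need to start a new one".
	resp, err := rt.RunPodSandbox(context.Background(), &runtimeapi.RunPodSandboxRequest{
		Config: &runtimeapi.PodSandboxConfig{
			Metadata: &runtimeapi.PodSandboxMetadata{
				Name:      "certified-operators-p7v9s",
				Namespace: "openshift-marketplace",
				Uid:       "dc59885a-9b00-47db-b9d8-e857c589abce",
			},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Println("sandbox ID:", resp.PodSandboxId)
}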
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.475465 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.475575 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctxk6\" (UniqueName: \"kubernetes.io/projected/dc59885a-9b00-47db-b9d8-e857c589abce-kube-api-access-ctxk6\") pod \"certified-operators-p7v9s\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") " pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.475672 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-catalog-content\") pod \"certified-operators-p7v9s\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") " pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.475711 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-utilities\") pod \"certified-operators-p7v9s\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") " pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.476253 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:26.976233751 +0000 UTC m=+101.670968133 (durationBeforeRetry 500ms). 
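[Annotation] Interleaved with the stuck CSI operations, the ordinary volume pipeline for the new catalog pods goes through cleanly: the reconciler first runs VerifyControllerAttachedVolume for each desired volume (projected service-account tokens such as kube-api-access-ctxk6, plus the empty-dir catalog-content and utilities volumes), then MountVolume, and "MountVolume.SetUp succeeded" follows almost immediately because neither volume type needs an external driver. A compressed sketch of that per-volume reconcile pass (structure is illustrative, not kubelet source):

package main

import "fmt"

type volume struct {
	name            string
	needsCSIDriver  bool
	driverAvailable bool
}

// reconcile mirrors the reconciler's happy path: verify attachment,
// then mount; CSI volumes bail out until their driver registers.
func reconcile(vols []volume) {
	for _, v := range vols {
		fmt.Printf("VerifyControllerAttachedVolume started for %q\n", v.name)
		fmt.Printf("MountVolume started for %q\n", v.name)
		if v.needsCSIDriver && !v.driverAvailable {
			fmt.Println("  -> failed: driver not registered; retry in 500ms")
			continue
		}
		fmt.Printf("MountVolume.SetUp succeeded for %q\n", v.name)
	}
}

func main() {
	reconcile([]volume{
		{name: "kube-api-access-ctxk6"},                // projected token
		{name: "catalog-content"},                      // empty-dir
		{name: "utilities"},                            // empty-dir
		{name: "pvc-657094db-…", needsCSIDriver: true}, // hostpath-provisioner PVC
	})
}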
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.584994 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.585206 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctxk6\" (UniqueName: \"kubernetes.io/projected/dc59885a-9b00-47db-b9d8-e857c589abce-kube-api-access-ctxk6\") pod \"certified-operators-p7v9s\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") " pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.585283 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-utilities\") pod \"community-operators-gz9wv\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") " pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.585307 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-catalog-content\") pod \"certified-operators-p7v9s\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") " pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.585343 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-catalog-content\") pod \"community-operators-gz9wv\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") " pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.585363 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-utilities\") pod \"certified-operators-p7v9s\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") " pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.585430 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2zwr\" (UniqueName: \"kubernetes.io/projected/e5ef4648-0d9e-487b-a796-476432ec0ca8-kube-api-access-f2zwr\") pod \"community-operators-gz9wv\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") " pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.585596 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-01-29 14:04:27.085577602 +0000 UTC m=+101.780311984 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.586413 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-catalog-content\") pod \"certified-operators-p7v9s\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") " pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.586682 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-utilities\") pod \"certified-operators-p7v9s\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") " pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.587722 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f8rzt"] Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.589308 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.600768 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f8rzt"] Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.604444 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-57lv7" event={"ID":"281741ae-7781-4682-8b1d-207c9a437581","Type":"ContainerStarted","Data":"4cfe3aa20ced9e9254bf33f8542e0b89fa152b1c3e6b8fd4e9cbe7234de620f3"} Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.658808 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctxk6\" (UniqueName: \"kubernetes.io/projected/dc59885a-9b00-47db-b9d8-e857c589abce-kube-api-access-ctxk6\") pod \"certified-operators-p7v9s\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") " pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.658909 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-58ksd" event={"ID":"f5f34f58-d006-4253-9328-eacdd6728e68","Type":"ContainerStarted","Data":"7ec1d92b8a45cf73c100e1ecb7ec328bd23eb9b548e1c6b6ec538d9a9eafade0"} Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.658942 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-58ksd" event={"ID":"f5f34f58-d006-4253-9328-eacdd6728e68","Type":"ContainerStarted","Data":"9ab87c1d3e11d9f5fe21510ae1efec8d2ae6a1ec5fee14b006e4459e11f6ec92"} Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.664637 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-57lv7" podStartSLOduration=81.664623201 podStartE2EDuration="1m21.664623201s" podCreationTimestamp="2026-01-29 14:03:05 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:26.659398136 +0000 UTC m=+101.354132508" watchObservedRunningTime="2026-01-29 14:04:26.664623201 +0000 UTC m=+101.359357583" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.687604 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2zwr\" (UniqueName: \"kubernetes.io/projected/e5ef4648-0d9e-487b-a796-476432ec0ca8-kube-api-access-f2zwr\") pod \"community-operators-gz9wv\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") " pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.687654 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.687719 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-utilities\") pod \"certified-operators-f8rzt\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.687750 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-utilities\") pod \"community-operators-gz9wv\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") " pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.687779 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-catalog-content\") pod \"community-operators-gz9wv\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") " pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.687812 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-catalog-content\") pod \"certified-operators-f8rzt\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.687830 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vdnp\" (UniqueName: \"kubernetes.io/projected/482babf9-e074-489f-aae6-eb9c48639f25-kube-api-access-4vdnp\") pod \"certified-operators-f8rzt\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.688612 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:27.188595297 +0000 UTC m=+101.883329679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.689217 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-catalog-content\") pod \"community-operators-gz9wv\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") " pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.689219 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-utilities\") pod \"community-operators-gz9wv\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") " pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.699881 4753 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.712226 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2zwr\" (UniqueName: \"kubernetes.io/projected/e5ef4648-0d9e-487b-a796-476432ec0ca8-kube-api-access-f2zwr\") pod \"community-operators-gz9wv\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") " pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.748549 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.772251 4753 patch_prober.go:28] interesting pod/router-default-5444994796-4v9zm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 14:04:26 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Jan 29 14:04:26 crc kubenswrapper[4753]: [+]process-running ok Jan 29 14:04:26 crc kubenswrapper[4753]: healthz check failed Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.772815 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v9zm" podUID="1db476d8-783c-4520-9b07-dfd4525a064f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.789192 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.789563 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-utilities\") pod \"certified-operators-f8rzt\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.789666 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-catalog-content\") pod \"certified-operators-f8rzt\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.789692 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vdnp\" (UniqueName: \"kubernetes.io/projected/482babf9-e074-489f-aae6-eb9c48639f25-kube-api-access-4vdnp\") pod \"certified-operators-f8rzt\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.790205 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:27.290141662 +0000 UTC m=+101.984876054 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.790793 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-utilities\") pod \"certified-operators-f8rzt\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.791636 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-catalog-content\") pod \"certified-operators-f8rzt\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.796686 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gz65p"] Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.798027 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.819319 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vdnp\" (UniqueName: \"kubernetes.io/projected/482babf9-e074-489f-aae6-eb9c48639f25-kube-api-access-4vdnp\") pod \"certified-operators-f8rzt\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.822656 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gz65p"] Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.890811 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hkp\" (UniqueName: \"kubernetes.io/projected/6e49742f-8ecc-4140-b477-5a3448e130cc-kube-api-access-r2hkp\") pod \"community-operators-gz65p\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.891279 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-utilities\") pod \"community-operators-gz65p\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.891453 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.891537 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-catalog-content\") pod \"community-operators-gz65p\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 14:04:26.891893 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:27.391873092 +0000 UTC m=+102.086607474 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.911790 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.912352 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.993039 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.993440 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hkp\" (UniqueName: \"kubernetes.io/projected/6e49742f-8ecc-4140-b477-5a3448e130cc-kube-api-access-r2hkp\") pod \"community-operators-gz65p\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.993491 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-utilities\") pod \"community-operators-gz65p\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.993600 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-catalog-content\") pod \"community-operators-gz65p\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.994082 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-catalog-content\") pod \"community-operators-gz65p\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:26 crc kubenswrapper[4753]: E0129 
14:04:26.994229 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:27.494212569 +0000 UTC m=+102.188946951 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:26 crc kubenswrapper[4753]: I0129 14:04:26.994695 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-utilities\") pod \"community-operators-gz65p\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.016870 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hkp\" (UniqueName: \"kubernetes.io/projected/6e49742f-8ecc-4140-b477-5a3448e130cc-kube-api-access-r2hkp\") pod \"community-operators-gz65p\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.095073 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:27 crc kubenswrapper[4753]: E0129 14:04:27.095571 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:27.595553357 +0000 UTC m=+102.290287739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.120580 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.175992 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f8rzt"] Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.196363 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:27 crc kubenswrapper[4753]: E0129 14:04:27.196497 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:27.696467914 +0000 UTC m=+102.391202296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.196708 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:27 crc kubenswrapper[4753]: E0129 14:04:27.197015 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-29 14:04:27.69700578 +0000 UTC m=+102.391740162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rf798" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.287471 4753 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-29T14:04:26.700039756Z","Handler":null,"Name":""} Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.293243 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gz9wv"] Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.298204 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:27 crc kubenswrapper[4753]: E0129 14:04:27.298523 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-29 14:04:27.798506703 +0000 UTC m=+102.493241085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.299881 4753 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.299918 4753 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.364467 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7v9s"] Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.399111 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.404888 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.404925 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.483072 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rf798\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.492583 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.504780 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.511585 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.666118 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gz65p"] Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.667211 4753 generic.go:334] "Generic (PLEG): container finished" podID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerID="6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc" exitCode=0 Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.667258 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz9wv" event={"ID":"e5ef4648-0d9e-487b-a796-476432ec0ca8","Type":"ContainerDied","Data":"6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc"} Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.667293 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz9wv" event={"ID":"e5ef4648-0d9e-487b-a796-476432ec0ca8","Type":"ContainerStarted","Data":"84f12de06eec476466b6a8134fb427aa0c66259e4d26ae1eaca7c66107cf7818"} Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.675289 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-58ksd" event={"ID":"f5f34f58-d006-4253-9328-eacdd6728e68","Type":"ContainerStarted","Data":"d837e7625dd2b647b662ac652ba6263efeb6d41efc003cec57ec39f0427f9df0"} Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.680961 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.711229 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7v9s" event={"ID":"dc59885a-9b00-47db-b9d8-e857c589abce","Type":"ContainerStarted","Data":"f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963"} Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.711312 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7v9s" event={"ID":"dc59885a-9b00-47db-b9d8-e857c589abce","Type":"ContainerStarted","Data":"e517ac2d5af94ab915954b5c6acdaa747a469aacd4e800a628a255502dfa777c"} Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.725889 4753 generic.go:334] "Generic (PLEG): container finished" podID="482babf9-e074-489f-aae6-eb9c48639f25" containerID="3e6bd543f9ac97d25415abbd607976b3bab94772a2a7f1fba5587be6558d40a6" exitCode=0 Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.726639 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8rzt" event={"ID":"482babf9-e074-489f-aae6-eb9c48639f25","Type":"ContainerDied","Data":"3e6bd543f9ac97d25415abbd607976b3bab94772a2a7f1fba5587be6558d40a6"} Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.726704 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8rzt" event={"ID":"482babf9-e074-489f-aae6-eb9c48639f25","Type":"ContainerStarted","Data":"d167c36fafc6b23d600751c65377eb6c5a284c97914b9fbdc1c3885f97b6bf01"} Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.739808 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-58ksd" podStartSLOduration=11.739681054 podStartE2EDuration="11.739681054s" podCreationTimestamp="2026-01-29 14:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:27.737282677 +0000 UTC m=+102.432017069" watchObservedRunningTime="2026-01-29 14:04:27.739681054 +0000 UTC m=+102.434415446" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.772119 4753 patch_prober.go:28] interesting pod/router-default-5444994796-4v9zm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 14:04:27 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Jan 29 14:04:27 crc kubenswrapper[4753]: [+]process-running ok Jan 29 14:04:27 crc kubenswrapper[4753]: healthz check failed Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.772204 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v9zm" podUID="1db476d8-783c-4520-9b07-dfd4525a064f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.778679 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rf798"] Jan 29 14:04:27 crc kubenswrapper[4753]: W0129 14:04:27.803106 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a18fef7_00cd_4027_bec1_91ded07e3bfb.slice/crio-f52becb34a6d57151b96fa50e01a058ff47cb6a6c2f9c31fcc35f199487d48b3 WatchSource:0}: Error finding container f52becb34a6d57151b96fa50e01a058ff47cb6a6c2f9c31fcc35f199487d48b3: Status 404 returned error can't find the container with id f52becb34a6d57151b96fa50e01a058ff47cb6a6c2f9c31fcc35f199487d48b3 Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.997581 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 29 14:04:27 crc kubenswrapper[4753]: I0129 14:04:27.998290 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.001307 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.002481 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.015542 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.048947 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.049021 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.056974 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.118835 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24015559-56e9-482d-bd0a-defdfe883ca9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"24015559-56e9-482d-bd0a-defdfe883ca9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.119361 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24015559-56e9-482d-bd0a-defdfe883ca9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"24015559-56e9-482d-bd0a-defdfe883ca9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.128641 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.128697 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.130317 4753 patch_prober.go:28] interesting pod/console-f9d7485db-bxx4k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.130366 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bxx4k" podUID="810f6f52-858d-46f6-b2d2-71f9c3135263" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.136305 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbfcj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.136377 4753 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-jbfcj" podUID="f5125f20-92c4-4700-8c7f-f8c7fc7b48b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.136318 4753 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbfcj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.136481 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jbfcj" podUID="f5125f20-92c4-4700-8c7f-f8c7fc7b48b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.159095 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.221040 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24015559-56e9-482d-bd0a-defdfe883ca9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"24015559-56e9-482d-bd0a-defdfe883ca9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.221183 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24015559-56e9-482d-bd0a-defdfe883ca9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"24015559-56e9-482d-bd0a-defdfe883ca9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.221201 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24015559-56e9-482d-bd0a-defdfe883ca9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"24015559-56e9-482d-bd0a-defdfe883ca9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.242927 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24015559-56e9-482d-bd0a-defdfe883ca9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"24015559-56e9-482d-bd0a-defdfe883ca9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.326248 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.397665 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jmmtj"] Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.399003 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.404127 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.422502 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmmtj"] Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.527829 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-utilities\") pod \"redhat-marketplace-jmmtj\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") " pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.528208 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-catalog-content\") pod \"redhat-marketplace-jmmtj\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") " pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.529294 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw7pb\" (UniqueName: \"kubernetes.io/projected/5875e879-79f3-4499-b460-b4b4a3e1637c-kube-api-access-sw7pb\") pod \"redhat-marketplace-jmmtj\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") " pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.631312 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw7pb\" (UniqueName: \"kubernetes.io/projected/5875e879-79f3-4499-b460-b4b4a3e1637c-kube-api-access-sw7pb\") pod \"redhat-marketplace-jmmtj\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") " pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.631376 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-utilities\") pod \"redhat-marketplace-jmmtj\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") " pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.631403 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-catalog-content\") pod \"redhat-marketplace-jmmtj\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") " pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.632004 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-utilities\") pod \"redhat-marketplace-jmmtj\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") " pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.632251 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-catalog-content\") pod \"redhat-marketplace-jmmtj\" (UID: 
\"5875e879-79f3-4499-b460-b4b4a3e1637c\") " pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.649644 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw7pb\" (UniqueName: \"kubernetes.io/projected/5875e879-79f3-4499-b460-b4b4a3e1637c-kube-api-access-sw7pb\") pod \"redhat-marketplace-jmmtj\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") " pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.722460 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.735043 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" event={"ID":"6a18fef7-00cd-4027-bec1-91ded07e3bfb","Type":"ContainerStarted","Data":"3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c"} Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.735095 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" event={"ID":"6a18fef7-00cd-4027-bec1-91ded07e3bfb","Type":"ContainerStarted","Data":"f52becb34a6d57151b96fa50e01a058ff47cb6a6c2f9c31fcc35f199487d48b3"} Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.735137 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.738222 4753 generic.go:334] "Generic (PLEG): container finished" podID="647fe5d7-4243-4608-8351-6bc2e13b9f15" containerID="f7344526d39deb3662a79603e6c72db1784dcdad1b3d65cb4b36220c87cb7682" exitCode=0 Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.738290 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" event={"ID":"647fe5d7-4243-4608-8351-6bc2e13b9f15","Type":"ContainerDied","Data":"f7344526d39deb3662a79603e6c72db1784dcdad1b3d65cb4b36220c87cb7682"} Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.756119 4753 generic.go:334] "Generic (PLEG): container finished" podID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerID="6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522" exitCode=0 Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.756140 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz65p" event={"ID":"6e49742f-8ecc-4140-b477-5a3448e130cc","Type":"ContainerDied","Data":"6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522"} Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.760622 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz65p" event={"ID":"6e49742f-8ecc-4140-b477-5a3448e130cc","Type":"ContainerStarted","Data":"7ea8788e793c6aee02dd5275d67004d091cb69d8ad60afba164b0e7a3627fe92"} Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.764853 4753 generic.go:334] "Generic (PLEG): container finished" podID="dc59885a-9b00-47db-b9d8-e857c589abce" containerID="f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963" exitCode=0 Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.765287 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7v9s" 
event={"ID":"dc59885a-9b00-47db-b9d8-e857c589abce","Type":"ContainerDied","Data":"f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963"} Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.778263 4753 patch_prober.go:28] interesting pod/router-default-5444994796-4v9zm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 14:04:28 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Jan 29 14:04:28 crc kubenswrapper[4753]: [+]process-running ok Jan 29 14:04:28 crc kubenswrapper[4753]: healthz check failed Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.778391 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v9zm" podUID="1db476d8-783c-4520-9b07-dfd4525a064f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.783489 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gd25h" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.784215 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" podStartSLOduration=83.784189318 podStartE2EDuration="1m23.784189318s" podCreationTimestamp="2026-01-29 14:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:28.779929489 +0000 UTC m=+103.474663871" watchObservedRunningTime="2026-01-29 14:04:28.784189318 +0000 UTC m=+103.478923700" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.799032 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rj7d8"] Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.813976 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.814044 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj7d8"] Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.870800 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.939186 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd6pv\" (UniqueName: \"kubernetes.io/projected/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-kube-api-access-wd6pv\") pod \"redhat-marketplace-rj7d8\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.939340 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-catalog-content\") pod \"redhat-marketplace-rj7d8\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:28 crc kubenswrapper[4753]: I0129 14:04:28.939363 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-utilities\") pod \"redhat-marketplace-rj7d8\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.008848 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.041331 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-catalog-content\") pod \"redhat-marketplace-rj7d8\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.041376 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-utilities\") pod \"redhat-marketplace-rj7d8\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.041439 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd6pv\" (UniqueName: \"kubernetes.io/projected/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-kube-api-access-wd6pv\") pod \"redhat-marketplace-rj7d8\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.042127 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-catalog-content\") pod \"redhat-marketplace-rj7d8\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.042377 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-utilities\") pod \"redhat-marketplace-rj7d8\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.068822 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd6pv\" (UniqueName: \"kubernetes.io/projected/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-kube-api-access-wd6pv\") pod \"redhat-marketplace-rj7d8\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.127754 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmmtj"] Jan 29 14:04:29 crc kubenswrapper[4753]: W0129 14:04:29.143640 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5875e879_79f3_4499_b460_b4b4a3e1637c.slice/crio-2f776b87193d1f11d63f3e3d43b41495f4ea10e54efbe7e0669a24a6262c2ac2 WatchSource:0}: Error finding container 2f776b87193d1f11d63f3e3d43b41495f4ea10e54efbe7e0669a24a6262c2ac2: Status 404 returned error can't find the container with id 2f776b87193d1f11d63f3e3d43b41495f4ea10e54efbe7e0669a24a6262c2ac2 Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.165891 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.443330 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2sj8l"] Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.444762 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.447893 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.465324 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sj8l"] Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.553378 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-catalog-content\") pod \"redhat-operators-2sj8l\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") " pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.553447 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-utilities\") pod \"redhat-operators-2sj8l\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") " pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.553692 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qbh\" (UniqueName: \"kubernetes.io/projected/48968e19-5dbd-4231-895f-c28e4178bf33-kube-api-access-58qbh\") pod \"redhat-operators-2sj8l\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") " pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.655384 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-catalog-content\") pod \"redhat-operators-2sj8l\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") " pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.655741 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-utilities\") pod \"redhat-operators-2sj8l\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") " pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.655820 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qbh\" (UniqueName: \"kubernetes.io/projected/48968e19-5dbd-4231-895f-c28e4178bf33-kube-api-access-58qbh\") pod \"redhat-operators-2sj8l\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") " pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.656477 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-catalog-content\") pod \"redhat-operators-2sj8l\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") " pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.656682 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-utilities\") pod \"redhat-operators-2sj8l\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") " 
pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.656691 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj7d8"] Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.680533 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qbh\" (UniqueName: \"kubernetes.io/projected/48968e19-5dbd-4231-895f-c28e4178bf33-kube-api-access-58qbh\") pod \"redhat-operators-2sj8l\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") " pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: W0129 14:04:29.681446 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e71e0d4_bc4b_4f43_8742_bcfe62d1d7e5.slice/crio-43e43a7f20107290566cb56a2d04b835e91336b8e7a300e1a80e696bc6b24236 WatchSource:0}: Error finding container 43e43a7f20107290566cb56a2d04b835e91336b8e7a300e1a80e696bc6b24236: Status 404 returned error can't find the container with id 43e43a7f20107290566cb56a2d04b835e91336b8e7a300e1a80e696bc6b24236 Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.774051 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.779513 4753 patch_prober.go:28] interesting pod/router-default-5444994796-4v9zm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 14:04:29 crc kubenswrapper[4753]: [-]has-synced failed: reason withheld Jan 29 14:04:29 crc kubenswrapper[4753]: [+]process-running ok Jan 29 14:04:29 crc kubenswrapper[4753]: healthz check failed Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.779572 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v9zm" podUID="1db476d8-783c-4520-9b07-dfd4525a064f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.780305 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b6pp2"] Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.786182 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.795407 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6pp2"] Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.797119 4753 generic.go:334] "Generic (PLEG): container finished" podID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerID="3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662" exitCode=0 Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.797205 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmmtj" event={"ID":"5875e879-79f3-4499-b460-b4b4a3e1637c","Type":"ContainerDied","Data":"3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662"} Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.797230 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmmtj" event={"ID":"5875e879-79f3-4499-b460-b4b4a3e1637c","Type":"ContainerStarted","Data":"2f776b87193d1f11d63f3e3d43b41495f4ea10e54efbe7e0669a24a6262c2ac2"} Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.822793 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"24015559-56e9-482d-bd0a-defdfe883ca9","Type":"ContainerStarted","Data":"f58e9ff87c5d461f44e69ec96d5a481ff2b723265abcd6026ccd04378c44e049"} Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.822844 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"24015559-56e9-482d-bd0a-defdfe883ca9","Type":"ContainerStarted","Data":"9e7dccefdea81b6e4d6b5b7f0086184a95122f49d9011f5394ea1b141e8d77f9"} Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.827565 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj7d8" event={"ID":"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5","Type":"ContainerStarted","Data":"43e43a7f20107290566cb56a2d04b835e91336b8e7a300e1a80e696bc6b24236"} Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.830624 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.838305 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.920957 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.920930068 podStartE2EDuration="2.920930068s" podCreationTimestamp="2026-01-29 14:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:29.903176454 +0000 UTC m=+104.597910836" watchObservedRunningTime="2026-01-29 14:04:29.920930068 +0000 UTC m=+104.615664450" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.976961 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-utilities\") pod \"redhat-operators-b6pp2\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.977084 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2brn2\" (UniqueName: \"kubernetes.io/projected/4762024a-21e5-4b76-a778-f2a16551c198-kube-api-access-2brn2\") pod \"redhat-operators-b6pp2\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:29 crc kubenswrapper[4753]: I0129 14:04:29.977117 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-catalog-content\") pod \"redhat-operators-b6pp2\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.078773 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2brn2\" (UniqueName: \"kubernetes.io/projected/4762024a-21e5-4b76-a778-f2a16551c198-kube-api-access-2brn2\") pod \"redhat-operators-b6pp2\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.079041 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-catalog-content\") pod \"redhat-operators-b6pp2\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.079124 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-utilities\") pod \"redhat-operators-b6pp2\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.079584 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-utilities\") pod \"redhat-operators-b6pp2\" 
(UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.079680 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-catalog-content\") pod \"redhat-operators-b6pp2\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.088777 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.098079 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-st6s2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.107634 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.109200 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2brn2\" (UniqueName: \"kubernetes.io/projected/4762024a-21e5-4b76-a778-f2a16551c198-kube-api-access-2brn2\") pod \"redhat-operators-b6pp2\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.141103 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l9ddt" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.202564 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.296736 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.385912 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647fe5d7-4243-4608-8351-6bc2e13b9f15-config-volume\") pod \"647fe5d7-4243-4608-8351-6bc2e13b9f15\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.386422 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czcmt\" (UniqueName: \"kubernetes.io/projected/647fe5d7-4243-4608-8351-6bc2e13b9f15-kube-api-access-czcmt\") pod \"647fe5d7-4243-4608-8351-6bc2e13b9f15\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.386503 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647fe5d7-4243-4608-8351-6bc2e13b9f15-secret-volume\") pod \"647fe5d7-4243-4608-8351-6bc2e13b9f15\" (UID: \"647fe5d7-4243-4608-8351-6bc2e13b9f15\") " Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.391193 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647fe5d7-4243-4608-8351-6bc2e13b9f15-config-volume" (OuterVolumeSpecName: "config-volume") pod "647fe5d7-4243-4608-8351-6bc2e13b9f15" (UID: "647fe5d7-4243-4608-8351-6bc2e13b9f15"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.393841 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647fe5d7-4243-4608-8351-6bc2e13b9f15-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "647fe5d7-4243-4608-8351-6bc2e13b9f15" (UID: "647fe5d7-4243-4608-8351-6bc2e13b9f15"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.404874 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647fe5d7-4243-4608-8351-6bc2e13b9f15-kube-api-access-czcmt" (OuterVolumeSpecName: "kube-api-access-czcmt") pod "647fe5d7-4243-4608-8351-6bc2e13b9f15" (UID: "647fe5d7-4243-4608-8351-6bc2e13b9f15"). InnerVolumeSpecName "kube-api-access-czcmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.488604 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647fe5d7-4243-4608-8351-6bc2e13b9f15-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.488647 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647fe5d7-4243-4608-8351-6bc2e13b9f15-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.488660 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czcmt\" (UniqueName: \"kubernetes.io/projected/647fe5d7-4243-4608-8351-6bc2e13b9f15-kube-api-access-czcmt\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.683100 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sj8l"] Jan 29 14:04:30 crc kubenswrapper[4753]: W0129 14:04:30.686941 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48968e19_5dbd_4231_895f_c28e4178bf33.slice/crio-48fac7314c91c0f85ccbb2ec22413e2ef7bfc191fc14520534d19d9313983eaa WatchSource:0}: Error finding container 48fac7314c91c0f85ccbb2ec22413e2ef7bfc191fc14520534d19d9313983eaa: Status 404 returned error can't find the container with id 48fac7314c91c0f85ccbb2ec22413e2ef7bfc191fc14520534d19d9313983eaa Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.743045 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6pp2"] Jan 29 14:04:30 crc kubenswrapper[4753]: W0129 14:04:30.761349 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4762024a_21e5_4b76_a778_f2a16551c198.slice/crio-3b7c4052d6f2dd18311c8daa0a73df3610a1a88cad683ad029ae5c0b8c4e66e9 WatchSource:0}: Error finding container 3b7c4052d6f2dd18311c8daa0a73df3610a1a88cad683ad029ae5c0b8c4e66e9: Status 404 returned error can't find the container with id 3b7c4052d6f2dd18311c8daa0a73df3610a1a88cad683ad029ae5c0b8c4e66e9 Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.771071 4753 patch_prober.go:28] interesting pod/router-default-5444994796-4v9zm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 29 14:04:30 crc 
kubenswrapper[4753]: [+]has-synced ok Jan 29 14:04:30 crc kubenswrapper[4753]: [+]process-running ok Jan 29 14:04:30 crc kubenswrapper[4753]: healthz check failed Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.771230 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4v9zm" podUID="1db476d8-783c-4520-9b07-dfd4525a064f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.900094 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" event={"ID":"647fe5d7-4243-4608-8351-6bc2e13b9f15","Type":"ContainerDied","Data":"ccf52d40fabd14bc8ac01133cccccd51fd4b6be0eaada6107380a2fdcaa2dc7f"} Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.900912 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.900922 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccf52d40fabd14bc8ac01133cccccd51fd4b6be0eaada6107380a2fdcaa2dc7f" Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.917495 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6pp2" event={"ID":"4762024a-21e5-4b76-a778-f2a16551c198","Type":"ContainerStarted","Data":"3b7c4052d6f2dd18311c8daa0a73df3610a1a88cad683ad029ae5c0b8c4e66e9"} Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.920912 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sj8l" event={"ID":"48968e19-5dbd-4231-895f-c28e4178bf33","Type":"ContainerStarted","Data":"48fac7314c91c0f85ccbb2ec22413e2ef7bfc191fc14520534d19d9313983eaa"} Jan 29 14:04:30 crc kubenswrapper[4753]: I0129 14:04:30.941987 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj7d8" event={"ID":"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5","Type":"ContainerStarted","Data":"f4731879011cb220b7ab9edaf8cb098ee14c9ae2a83105767e8531e6ff1d333b"} Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.180423 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 29 14:04:31 crc kubenswrapper[4753]: E0129 14:04:31.180651 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647fe5d7-4243-4608-8351-6bc2e13b9f15" containerName="collect-profiles" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.180664 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="647fe5d7-4243-4608-8351-6bc2e13b9f15" containerName="collect-profiles" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.180764 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="647fe5d7-4243-4608-8351-6bc2e13b9f15" containerName="collect-profiles" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.181141 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.184002 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.184215 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.226127 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.305864 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32e182d7-62af-4c3d-b948-485f7d7a0609-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"32e182d7-62af-4c3d-b948-485f7d7a0609\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.305988 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32e182d7-62af-4c3d-b948-485f7d7a0609-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"32e182d7-62af-4c3d-b948-485f7d7a0609\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.407886 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32e182d7-62af-4c3d-b948-485f7d7a0609-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"32e182d7-62af-4c3d-b948-485f7d7a0609\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.408026 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32e182d7-62af-4c3d-b948-485f7d7a0609-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"32e182d7-62af-4c3d-b948-485f7d7a0609\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.408287 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32e182d7-62af-4c3d-b948-485f7d7a0609-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"32e182d7-62af-4c3d-b948-485f7d7a0609\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.442339 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32e182d7-62af-4c3d-b948-485f7d7a0609-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"32e182d7-62af-4c3d-b948-485f7d7a0609\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.503626 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.775816 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.779563 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-4v9zm" Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.988074 4753 generic.go:334] "Generic (PLEG): container finished" podID="48968e19-5dbd-4231-895f-c28e4178bf33" containerID="c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e" exitCode=0 Jan 29 14:04:31 crc kubenswrapper[4753]: I0129 14:04:31.989234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sj8l" event={"ID":"48968e19-5dbd-4231-895f-c28e4178bf33","Type":"ContainerDied","Data":"c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e"} Jan 29 14:04:32 crc kubenswrapper[4753]: I0129 14:04:32.011344 4753 generic.go:334] "Generic (PLEG): container finished" podID="24015559-56e9-482d-bd0a-defdfe883ca9" containerID="f58e9ff87c5d461f44e69ec96d5a481ff2b723265abcd6026ccd04378c44e049" exitCode=0 Jan 29 14:04:32 crc kubenswrapper[4753]: I0129 14:04:32.011425 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"24015559-56e9-482d-bd0a-defdfe883ca9","Type":"ContainerDied","Data":"f58e9ff87c5d461f44e69ec96d5a481ff2b723265abcd6026ccd04378c44e049"} Jan 29 14:04:32 crc kubenswrapper[4753]: I0129 14:04:32.030171 4753 generic.go:334] "Generic (PLEG): container finished" podID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerID="f4731879011cb220b7ab9edaf8cb098ee14c9ae2a83105767e8531e6ff1d333b" exitCode=0 Jan 29 14:04:32 crc kubenswrapper[4753]: I0129 14:04:32.030274 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj7d8" event={"ID":"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5","Type":"ContainerDied","Data":"f4731879011cb220b7ab9edaf8cb098ee14c9ae2a83105767e8531e6ff1d333b"} Jan 29 14:04:32 crc kubenswrapper[4753]: I0129 14:04:32.048694 4753 generic.go:334] "Generic (PLEG): container finished" podID="4762024a-21e5-4b76-a778-f2a16551c198" containerID="14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b" exitCode=0 Jan 29 14:04:32 crc kubenswrapper[4753]: I0129 14:04:32.048861 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6pp2" event={"ID":"4762024a-21e5-4b76-a778-f2a16551c198","Type":"ContainerDied","Data":"14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b"} Jan 29 14:04:32 crc kubenswrapper[4753]: I0129 14:04:32.058608 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 29 14:04:32 crc kubenswrapper[4753]: W0129 14:04:32.098647 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod32e182d7_62af_4c3d_b948_485f7d7a0609.slice/crio-6d412ae6181680c46ba6e23c372606ceb81fa42b6e42f7fb9e877551e5476dd1 WatchSource:0}: Error finding container 6d412ae6181680c46ba6e23c372606ceb81fa42b6e42f7fb9e877551e5476dd1: Status 404 returned error can't find the container with id 6d412ae6181680c46ba6e23c372606ceb81fa42b6e42f7fb9e877551e5476dd1 Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.063415 4753 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"32e182d7-62af-4c3d-b948-485f7d7a0609","Type":"ContainerStarted","Data":"777a2bf6721d9255e2d9f6446836e431f65254867ba364bea81e859719476b2e"} Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.063778 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"32e182d7-62af-4c3d-b948-485f7d7a0609","Type":"ContainerStarted","Data":"6d412ae6181680c46ba6e23c372606ceb81fa42b6e42f7fb9e877551e5476dd1"} Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.077506 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.077484298 podStartE2EDuration="2.077484298s" podCreationTimestamp="2026-01-29 14:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:04:33.076625865 +0000 UTC m=+107.771360277" watchObservedRunningTime="2026-01-29 14:04:33.077484298 +0000 UTC m=+107.772218680" Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.419732 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.582363 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24015559-56e9-482d-bd0a-defdfe883ca9-kube-api-access\") pod \"24015559-56e9-482d-bd0a-defdfe883ca9\" (UID: \"24015559-56e9-482d-bd0a-defdfe883ca9\") " Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.582537 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24015559-56e9-482d-bd0a-defdfe883ca9-kubelet-dir\") pod \"24015559-56e9-482d-bd0a-defdfe883ca9\" (UID: \"24015559-56e9-482d-bd0a-defdfe883ca9\") " Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.582896 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24015559-56e9-482d-bd0a-defdfe883ca9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "24015559-56e9-482d-bd0a-defdfe883ca9" (UID: "24015559-56e9-482d-bd0a-defdfe883ca9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.614131 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24015559-56e9-482d-bd0a-defdfe883ca9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "24015559-56e9-482d-bd0a-defdfe883ca9" (UID: "24015559-56e9-482d-bd0a-defdfe883ca9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.684622 4753 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24015559-56e9-482d-bd0a-defdfe883ca9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:33 crc kubenswrapper[4753]: I0129 14:04:33.684657 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24015559-56e9-482d-bd0a-defdfe883ca9-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:34 crc kubenswrapper[4753]: I0129 14:04:34.083235 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 29 14:04:34 crc kubenswrapper[4753]: I0129 14:04:34.083246 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"24015559-56e9-482d-bd0a-defdfe883ca9","Type":"ContainerDied","Data":"9e7dccefdea81b6e4d6b5b7f0086184a95122f49d9011f5394ea1b141e8d77f9"} Jan 29 14:04:34 crc kubenswrapper[4753]: I0129 14:04:34.083342 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e7dccefdea81b6e4d6b5b7f0086184a95122f49d9011f5394ea1b141e8d77f9" Jan 29 14:04:34 crc kubenswrapper[4753]: I0129 14:04:34.133523 4753 generic.go:334] "Generic (PLEG): container finished" podID="32e182d7-62af-4c3d-b948-485f7d7a0609" containerID="777a2bf6721d9255e2d9f6446836e431f65254867ba364bea81e859719476b2e" exitCode=0 Jan 29 14:04:34 crc kubenswrapper[4753]: I0129 14:04:34.133577 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"32e182d7-62af-4c3d-b948-485f7d7a0609","Type":"ContainerDied","Data":"777a2bf6721d9255e2d9f6446836e431f65254867ba364bea81e859719476b2e"} Jan 29 14:04:34 crc kubenswrapper[4753]: I0129 14:04:34.479888 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mxrlb" Jan 29 14:04:35 crc kubenswrapper[4753]: I0129 14:04:35.492477 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:35 crc kubenswrapper[4753]: I0129 14:04:35.629844 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32e182d7-62af-4c3d-b948-485f7d7a0609-kubelet-dir\") pod \"32e182d7-62af-4c3d-b948-485f7d7a0609\" (UID: \"32e182d7-62af-4c3d-b948-485f7d7a0609\") " Jan 29 14:04:35 crc kubenswrapper[4753]: I0129 14:04:35.629929 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32e182d7-62af-4c3d-b948-485f7d7a0609-kube-api-access\") pod \"32e182d7-62af-4c3d-b948-485f7d7a0609\" (UID: \"32e182d7-62af-4c3d-b948-485f7d7a0609\") " Jan 29 14:04:35 crc kubenswrapper[4753]: I0129 14:04:35.630013 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32e182d7-62af-4c3d-b948-485f7d7a0609-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "32e182d7-62af-4c3d-b948-485f7d7a0609" (UID: "32e182d7-62af-4c3d-b948-485f7d7a0609"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:04:35 crc kubenswrapper[4753]: I0129 14:04:35.631132 4753 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32e182d7-62af-4c3d-b948-485f7d7a0609-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:35 crc kubenswrapper[4753]: I0129 14:04:35.636974 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e182d7-62af-4c3d-b948-485f7d7a0609-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "32e182d7-62af-4c3d-b948-485f7d7a0609" (UID: "32e182d7-62af-4c3d-b948-485f7d7a0609"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:04:35 crc kubenswrapper[4753]: I0129 14:04:35.732017 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32e182d7-62af-4c3d-b948-485f7d7a0609-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:36 crc kubenswrapper[4753]: I0129 14:04:36.183525 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"32e182d7-62af-4c3d-b948-485f7d7a0609","Type":"ContainerDied","Data":"6d412ae6181680c46ba6e23c372606ceb81fa42b6e42f7fb9e877551e5476dd1"} Jan 29 14:04:36 crc kubenswrapper[4753]: I0129 14:04:36.183566 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d412ae6181680c46ba6e23c372606ceb81fa42b6e42f7fb9e877551e5476dd1" Jan 29 14:04:36 crc kubenswrapper[4753]: I0129 14:04:36.183618 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 29 14:04:38 crc kubenswrapper[4753]: I0129 14:04:38.135719 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:38 crc kubenswrapper[4753]: I0129 14:04:38.140360 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:04:38 crc kubenswrapper[4753]: I0129 14:04:38.144280 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jbfcj" Jan 29 14:04:43 crc kubenswrapper[4753]: I0129 14:04:43.531901 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5xcbl"] Jan 29 14:04:43 crc kubenswrapper[4753]: I0129 14:04:43.536023 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" podUID="18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" containerName="controller-manager" containerID="cri-o://b22c5c3be2362c805abdf785d9d3ed0a8e49b9a524ae6159b3e13b2b1a8d945d" gracePeriod=30 Jan 29 14:04:43 crc kubenswrapper[4753]: I0129 14:04:43.543933 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"] Jan 29 14:04:43 crc kubenswrapper[4753]: I0129 14:04:43.544251 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" podUID="776ea39d-e7fb-497f-bfaf-41b385b76754" containerName="route-controller-manager" containerID="cri-o://f3f3147f75b0472631fc699585d5535c19e3da50acddeadfece6a2e4ed7892b7" gracePeriod=30 Jan 29 14:04:44 crc 
kubenswrapper[4753]: I0129 14:04:44.279452 4753 generic.go:334] "Generic (PLEG): container finished" podID="776ea39d-e7fb-497f-bfaf-41b385b76754" containerID="f3f3147f75b0472631fc699585d5535c19e3da50acddeadfece6a2e4ed7892b7" exitCode=0 Jan 29 14:04:44 crc kubenswrapper[4753]: I0129 14:04:44.279565 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" event={"ID":"776ea39d-e7fb-497f-bfaf-41b385b76754","Type":"ContainerDied","Data":"f3f3147f75b0472631fc699585d5535c19e3da50acddeadfece6a2e4ed7892b7"} Jan 29 14:04:44 crc kubenswrapper[4753]: I0129 14:04:44.282857 4753 generic.go:334] "Generic (PLEG): container finished" podID="18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" containerID="b22c5c3be2362c805abdf785d9d3ed0a8e49b9a524ae6159b3e13b2b1a8d945d" exitCode=0 Jan 29 14:04:44 crc kubenswrapper[4753]: I0129 14:04:44.282904 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" event={"ID":"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23","Type":"ContainerDied","Data":"b22c5c3be2362c805abdf785d9d3ed0a8e49b9a524ae6159b3e13b2b1a8d945d"} Jan 29 14:04:47 crc kubenswrapper[4753]: I0129 14:04:47.498957 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.072913 4753 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5xcbl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.072986 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" podUID="18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.439615 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.484422 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4"] Jan 29 14:04:48 crc kubenswrapper[4753]: E0129 14:04:48.484721 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="776ea39d-e7fb-497f-bfaf-41b385b76754" containerName="route-controller-manager" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.484738 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="776ea39d-e7fb-497f-bfaf-41b385b76754" containerName="route-controller-manager" Jan 29 14:04:48 crc kubenswrapper[4753]: E0129 14:04:48.484752 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24015559-56e9-482d-bd0a-defdfe883ca9" containerName="pruner" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.484760 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="24015559-56e9-482d-bd0a-defdfe883ca9" containerName="pruner" Jan 29 14:04:48 crc kubenswrapper[4753]: E0129 14:04:48.484770 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e182d7-62af-4c3d-b948-485f7d7a0609" containerName="pruner" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.484779 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e182d7-62af-4c3d-b948-485f7d7a0609" containerName="pruner" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.484909 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="24015559-56e9-482d-bd0a-defdfe883ca9" containerName="pruner" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.484920 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="776ea39d-e7fb-497f-bfaf-41b385b76754" containerName="route-controller-manager" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.484935 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="32e182d7-62af-4c3d-b948-485f7d7a0609" containerName="pruner" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.485504 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.498894 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4"] Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.595451 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-client-ca\") pod \"776ea39d-e7fb-497f-bfaf-41b385b76754\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.595564 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/776ea39d-e7fb-497f-bfaf-41b385b76754-serving-cert\") pod \"776ea39d-e7fb-497f-bfaf-41b385b76754\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.595652 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpfcs\" (UniqueName: \"kubernetes.io/projected/776ea39d-e7fb-497f-bfaf-41b385b76754-kube-api-access-gpfcs\") pod \"776ea39d-e7fb-497f-bfaf-41b385b76754\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.595690 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-config\") pod \"776ea39d-e7fb-497f-bfaf-41b385b76754\" (UID: \"776ea39d-e7fb-497f-bfaf-41b385b76754\") " Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.595893 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-client-ca\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.595988 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2d5z\" (UniqueName: \"kubernetes.io/projected/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-kube-api-access-t2d5z\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.596043 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-serving-cert\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.596097 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-config\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc 
kubenswrapper[4753]: I0129 14:04:48.596184 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-client-ca" (OuterVolumeSpecName: "client-ca") pod "776ea39d-e7fb-497f-bfaf-41b385b76754" (UID: "776ea39d-e7fb-497f-bfaf-41b385b76754"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.596784 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-config" (OuterVolumeSpecName: "config") pod "776ea39d-e7fb-497f-bfaf-41b385b76754" (UID: "776ea39d-e7fb-497f-bfaf-41b385b76754"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.602720 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/776ea39d-e7fb-497f-bfaf-41b385b76754-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "776ea39d-e7fb-497f-bfaf-41b385b76754" (UID: "776ea39d-e7fb-497f-bfaf-41b385b76754"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.603377 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/776ea39d-e7fb-497f-bfaf-41b385b76754-kube-api-access-gpfcs" (OuterVolumeSpecName: "kube-api-access-gpfcs") pod "776ea39d-e7fb-497f-bfaf-41b385b76754" (UID: "776ea39d-e7fb-497f-bfaf-41b385b76754"). InnerVolumeSpecName "kube-api-access-gpfcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.700822 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2d5z\" (UniqueName: \"kubernetes.io/projected/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-kube-api-access-t2d5z\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.700895 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-serving-cert\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.700944 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-config\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.700970 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-client-ca\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.701022 4753 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/776ea39d-e7fb-497f-bfaf-41b385b76754-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.701034 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpfcs\" (UniqueName: \"kubernetes.io/projected/776ea39d-e7fb-497f-bfaf-41b385b76754-kube-api-access-gpfcs\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.701044 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.701052 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/776ea39d-e7fb-497f-bfaf-41b385b76754-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.739416 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-client-ca\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.741551 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-config\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.747329 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-serving-cert\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.747806 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2d5z\" (UniqueName: \"kubernetes.io/projected/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-kube-api-access-t2d5z\") pod \"route-controller-manager-67b7ddd96d-mnql4\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:48 crc kubenswrapper[4753]: I0129 14:04:48.817608 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:04:49 crc kubenswrapper[4753]: I0129 14:04:49.325140 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" event={"ID":"776ea39d-e7fb-497f-bfaf-41b385b76754","Type":"ContainerDied","Data":"6de60352cd886e1035cbd0fe39197dd5b25eaef1cc236789bba25d1864d6e0b1"} Jan 29 14:04:49 crc kubenswrapper[4753]: I0129 14:04:49.325229 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt" Jan 29 14:04:49 crc kubenswrapper[4753]: I0129 14:04:49.325326 4753 scope.go:117] "RemoveContainer" containerID="f3f3147f75b0472631fc699585d5535c19e3da50acddeadfece6a2e4ed7892b7" Jan 29 14:04:49 crc kubenswrapper[4753]: I0129 14:04:49.373726 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"] Jan 29 14:04:49 crc kubenswrapper[4753]: I0129 14:04:49.379510 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lhpqt"] Jan 29 14:04:50 crc kubenswrapper[4753]: I0129 14:04:50.159230 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="776ea39d-e7fb-497f-bfaf-41b385b76754" path="/var/lib/kubelet/pods/776ea39d-e7fb-497f-bfaf-41b385b76754/volumes" Jan 29 14:04:59 crc kubenswrapper[4753]: I0129 14:04:59.073791 4753 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5xcbl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 29 14:04:59 crc kubenswrapper[4753]: I0129 14:04:59.074772 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" podUID="18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 29 14:04:59 crc kubenswrapper[4753]: E0129 14:04:59.200259 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 14:04:59 crc kubenswrapper[4753]: E0129 14:04:59.200465 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4vdnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-f8rzt_openshift-marketplace(482babf9-e074-489f-aae6-eb9c48639f25): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 14:04:59 crc kubenswrapper[4753]: E0129 14:04:59.201608 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-f8rzt" podUID="482babf9-e074-489f-aae6-eb9c48639f25" Jan 29 14:04:59 crc kubenswrapper[4753]: I0129 14:04:59.749243 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f297q" Jan 29 14:05:00 crc kubenswrapper[4753]: E0129 14:05:00.638628 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-f8rzt" podUID="482babf9-e074-489f-aae6-eb9c48639f25" Jan 29 14:05:00 crc kubenswrapper[4753]: E0129 14:05:00.818644 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 14:05:00 crc kubenswrapper[4753]: E0129 14:05:00.818879 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sw7pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jmmtj_openshift-marketplace(5875e879-79f3-4499-b460-b4b4a3e1637c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 14:05:00 crc kubenswrapper[4753]: E0129 14:05:00.820084 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jmmtj" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" Jan 29 14:05:03 crc kubenswrapper[4753]: I0129 14:05:03.231612 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4"] Jan 29 14:05:04 crc kubenswrapper[4753]: E0129 14:05:04.895434 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jmmtj" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.042246 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.094787 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm"] Jan 29 14:05:05 crc kubenswrapper[4753]: E0129 14:05:05.095310 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" containerName="controller-manager" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.095345 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" containerName="controller-manager" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.095616 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" containerName="controller-manager" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.096553 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.105876 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm"] Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.119004 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-serving-cert\") pod \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.119096 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-927kk\" (UniqueName: \"kubernetes.io/projected/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-kube-api-access-927kk\") pod \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.119281 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-proxy-ca-bundles\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.119354 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-serving-cert\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.119427 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-config\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.119677 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-client-ca\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.119729 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnfc\" (UniqueName: \"kubernetes.io/projected/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-kube-api-access-rgnfc\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.131069 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" (UID: "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.131119 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-kube-api-access-927kk" (OuterVolumeSpecName: "kube-api-access-927kk") pod "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" (UID: "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23"). InnerVolumeSpecName "kube-api-access-927kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220244 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-client-ca\") pod \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220303 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-config\") pod \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220336 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-proxy-ca-bundles\") pod \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\" (UID: \"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23\") " Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220460 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-serving-cert\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220540 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-config\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220593 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-client-ca\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220615 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgnfc\" (UniqueName: \"kubernetes.io/projected/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-kube-api-access-rgnfc\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220639 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-proxy-ca-bundles\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220723 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-927kk\" (UniqueName: \"kubernetes.io/projected/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-kube-api-access-927kk\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.220738 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.221065 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-client-ca" (OuterVolumeSpecName: "client-ca") pod "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" (UID: "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.221186 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" (UID: "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.221563 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-config" (OuterVolumeSpecName: "config") pod "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" (UID: "18bdd5fc-7b40-42b0-a25b-2c3abdc96f23"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.222580 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-client-ca\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.222711 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-proxy-ca-bundles\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.322048 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.322115 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.322207 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.437403 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" event={"ID":"18bdd5fc-7b40-42b0-a25b-2c3abdc96f23","Type":"ContainerDied","Data":"1993f9406507092e575b1f0f4df62719cfa72f813b2f425799ab8a94bbd513a4"} Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.437516 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5xcbl" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.458352 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-serving-cert\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.459621 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-config\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.463642 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgnfc\" (UniqueName: \"kubernetes.io/projected/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-kube-api-access-rgnfc\") pod \"controller-manager-dc4bfd7fd-86mgm\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.482429 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5xcbl"] Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.487714 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5xcbl"] Jan 29 14:05:05 crc kubenswrapper[4753]: I0129 14:05:05.714678 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:06 crc kubenswrapper[4753]: I0129 14:05:06.161497 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18bdd5fc-7b40-42b0-a25b-2c3abdc96f23" path="/var/lib/kubelet/pods/18bdd5fc-7b40-42b0-a25b-2c3abdc96f23/volumes" Jan 29 14:05:06 crc kubenswrapper[4753]: E0129 14:05:06.900631 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 14:05:06 crc kubenswrapper[4753]: E0129 14:05:06.900797 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2hkp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gz65p_openshift-marketplace(6e49742f-8ecc-4140-b477-5a3448e130cc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 14:05:06 crc kubenswrapper[4753]: E0129 14:05:06.901962 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gz65p" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" Jan 29 14:05:07 crc kubenswrapper[4753]: E0129 14:05:07.058578 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 29 14:05:07 crc kubenswrapper[4753]: E0129 14:05:07.058753 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wd6pv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rj7d8_openshift-marketplace(2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 14:05:07 crc kubenswrapper[4753]: E0129 14:05:07.059963 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rj7d8" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" Jan 29 14:05:07 crc kubenswrapper[4753]: E0129 14:05:07.276960 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 14:05:07 crc kubenswrapper[4753]: E0129 14:05:07.277171 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctxk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p7v9s_openshift-marketplace(dc59885a-9b00-47db-b9d8-e857c589abce): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 14:05:07 crc kubenswrapper[4753]: E0129 14:05:07.278378 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p7v9s" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.168920 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rj7d8" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.169105 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p7v9s" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.169193 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gz65p" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" Jan 29 14:05:10 crc kubenswrapper[4753]: I0129 14:05:10.195603 4753 scope.go:117] "RemoveContainer" containerID="b22c5c3be2362c805abdf785d9d3ed0a8e49b9a524ae6159b3e13b2b1a8d945d" Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.286954 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.287278 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2brn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-b6pp2_openshift-marketplace(4762024a-21e5-4b76-a778-f2a16551c198): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.288609 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-b6pp2" podUID="4762024a-21e5-4b76-a778-f2a16551c198" Jan 29 14:05:10 crc kubenswrapper[4753]: I0129 14:05:10.480777 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz9wv" event={"ID":"e5ef4648-0d9e-487b-a796-476432ec0ca8","Type":"ContainerStarted","Data":"9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37"} Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.482654 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-b6pp2" podUID="4762024a-21e5-4b76-a778-f2a16551c198" Jan 29 14:05:10 crc kubenswrapper[4753]: I0129 14:05:10.519203 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm"] Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.526387 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.526596 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58qbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2sj8l_openshift-marketplace(48968e19-5dbd-4231-895f-c28e4178bf33): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 14:05:10 crc kubenswrapper[4753]: E0129 14:05:10.527828 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-2sj8l" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" Jan 29 14:05:10 crc kubenswrapper[4753]: W0129 14:05:10.537473 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f9f0d20_5bae_45d7_90c9_fb2e9f2e1c66.slice/crio-1084ebfb20d6b6e893c6301da846d90f1d25fac55fc61ef073a53627b05cf9a2 WatchSource:0}: Error finding container 1084ebfb20d6b6e893c6301da846d90f1d25fac55fc61ef073a53627b05cf9a2: Status 404 returned error can't find the container with id 1084ebfb20d6b6e893c6301da846d90f1d25fac55fc61ef073a53627b05cf9a2 Jan 29 14:05:10 crc kubenswrapper[4753]: I0129 14:05:10.693969 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4"] Jan 29 14:05:10 crc kubenswrapper[4753]: W0129 14:05:10.703672 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a7eab0e_aaeb_4d53_acd4_42c5ca8b2d69.slice/crio-15fe658613db25f9d645dbcb8d4d8ce46052c94b4d2fc5929abae0b74f3daaa2 WatchSource:0}: Error finding container 15fe658613db25f9d645dbcb8d4d8ce46052c94b4d2fc5929abae0b74f3daaa2: Status 404 
returned error can't find the container with id 15fe658613db25f9d645dbcb8d4d8ce46052c94b4d2fc5929abae0b74f3daaa2 Jan 29 14:05:10 crc kubenswrapper[4753]: I0129 14:05:10.966325 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 29 14:05:10 crc kubenswrapper[4753]: I0129 14:05:10.967696 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:10 crc kubenswrapper[4753]: I0129 14:05:10.969920 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 29 14:05:10 crc kubenswrapper[4753]: I0129 14:05:10.970362 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 29 14:05:10 crc kubenswrapper[4753]: I0129 14:05:10.984674 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.016665 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a9fde53-86d6-45bd-8060-5be644476488-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5a9fde53-86d6-45bd-8060-5be644476488\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.016759 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a9fde53-86d6-45bd-8060-5be644476488-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5a9fde53-86d6-45bd-8060-5be644476488\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.117955 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a9fde53-86d6-45bd-8060-5be644476488-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5a9fde53-86d6-45bd-8060-5be644476488\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.118024 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a9fde53-86d6-45bd-8060-5be644476488-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5a9fde53-86d6-45bd-8060-5be644476488\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.118172 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a9fde53-86d6-45bd-8060-5be644476488-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5a9fde53-86d6-45bd-8060-5be644476488\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.143339 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a9fde53-86d6-45bd-8060-5be644476488-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5a9fde53-86d6-45bd-8060-5be644476488\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.285469 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.487617 4753 generic.go:334] "Generic (PLEG): container finished" podID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerID="9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37" exitCode=0 Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.487730 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz9wv" event={"ID":"e5ef4648-0d9e-487b-a796-476432ec0ca8","Type":"ContainerDied","Data":"9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37"} Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.490474 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" event={"ID":"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69","Type":"ContainerStarted","Data":"f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202"} Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.490505 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" event={"ID":"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69","Type":"ContainerStarted","Data":"15fe658613db25f9d645dbcb8d4d8ce46052c94b4d2fc5929abae0b74f3daaa2"} Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.490630 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" podUID="4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" containerName="route-controller-manager" containerID="cri-o://f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202" gracePeriod=30 Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.490762 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.496778 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" event={"ID":"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66","Type":"ContainerStarted","Data":"6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de"} Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.496816 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" event={"ID":"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66","Type":"ContainerStarted","Data":"1084ebfb20d6b6e893c6301da846d90f1d25fac55fc61ef073a53627b05cf9a2"} Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.496849 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.497081 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:11 crc kubenswrapper[4753]: E0129 14:05:11.499500 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2sj8l" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.506672 4753 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.545906 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" podStartSLOduration=28.545884096 podStartE2EDuration="28.545884096s" podCreationTimestamp="2026-01-29 14:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:05:11.54420944 +0000 UTC m=+146.238943842" watchObservedRunningTime="2026-01-29 14:05:11.545884096 +0000 UTC m=+146.240618478" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.548553 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" podStartSLOduration=8.54853485 podStartE2EDuration="8.54853485s" podCreationTimestamp="2026-01-29 14:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:05:11.528877513 +0000 UTC m=+146.223611905" watchObservedRunningTime="2026-01-29 14:05:11.54853485 +0000 UTC m=+146.243269232" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.771734 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.850307 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.876342 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d794955c-4298z"] Jan 29 14:05:11 crc kubenswrapper[4753]: E0129 14:05:11.877030 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" containerName="route-controller-manager" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.877047 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" containerName="route-controller-manager" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.877606 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" containerName="route-controller-manager" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.878059 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.896044 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d794955c-4298z"] Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.932640 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-config\") pod \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.932700 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-client-ca\") pod \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.932789 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-serving-cert\") pod \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.932931 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2d5z\" (UniqueName: \"kubernetes.io/projected/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-kube-api-access-t2d5z\") pod \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\" (UID: \"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69\") " Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.933240 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8991923e-c6af-4748-93fa-2735e15a903b-serving-cert\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.933357 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-client-ca\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.933492 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-client-ca" (OuterVolumeSpecName: "client-ca") pod "4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" (UID: "4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.933544 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-config" (OuterVolumeSpecName: "config") pod "4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" (UID: "4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.933572 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-config\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.933668 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz8w2\" (UniqueName: \"kubernetes.io/projected/8991923e-c6af-4748-93fa-2735e15a903b-kube-api-access-gz8w2\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.933845 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.933899 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.940570 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-kube-api-access-t2d5z" (OuterVolumeSpecName: "kube-api-access-t2d5z") pod "4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" (UID: "4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69"). InnerVolumeSpecName "kube-api-access-t2d5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:11 crc kubenswrapper[4753]: I0129 14:05:11.940606 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" (UID: "4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.034490 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz8w2\" (UniqueName: \"kubernetes.io/projected/8991923e-c6af-4748-93fa-2735e15a903b-kube-api-access-gz8w2\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.034727 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8991923e-c6af-4748-93fa-2735e15a903b-serving-cert\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.034852 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-client-ca\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.034940 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-config\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.034992 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2d5z\" (UniqueName: \"kubernetes.io/projected/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-kube-api-access-t2d5z\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.035008 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.035981 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-client-ca\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.036516 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-config\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.039574 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8991923e-c6af-4748-93fa-2735e15a903b-serving-cert\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " 
pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.051581 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz8w2\" (UniqueName: \"kubernetes.io/projected/8991923e-c6af-4748-93fa-2735e15a903b-kube-api-access-gz8w2\") pod \"route-controller-manager-86d794955c-4298z\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.201784 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.505964 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz9wv" event={"ID":"e5ef4648-0d9e-487b-a796-476432ec0ca8","Type":"ContainerStarted","Data":"50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f"} Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.508000 4753 generic.go:334] "Generic (PLEG): container finished" podID="4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" containerID="f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202" exitCode=0 Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.508105 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.508894 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" event={"ID":"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69","Type":"ContainerDied","Data":"f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202"} Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.509358 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4" event={"ID":"4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69","Type":"ContainerDied","Data":"15fe658613db25f9d645dbcb8d4d8ce46052c94b4d2fc5929abae0b74f3daaa2"} Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.509386 4753 scope.go:117] "RemoveContainer" containerID="f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.511725 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5a9fde53-86d6-45bd-8060-5be644476488","Type":"ContainerStarted","Data":"83972f3bb349306695fe86cdc96ea10c29cb87dca5560c5edb95d7fb358cabce"} Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.511753 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5a9fde53-86d6-45bd-8060-5be644476488","Type":"ContainerStarted","Data":"17f3acb446081cb1b0367a61982932bd0879c14b84a479e39cfd18c7789b4cab"} Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.567946 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gz9wv" podStartSLOduration=2.19533898 podStartE2EDuration="46.567915895s" podCreationTimestamp="2026-01-29 14:04:26 +0000 UTC" firstStartedPulling="2026-01-29 14:04:27.672131516 +0000 UTC m=+102.366865898" lastFinishedPulling="2026-01-29 14:05:12.044708431 +0000 UTC m=+146.739442813" 
observedRunningTime="2026-01-29 14:05:12.528776526 +0000 UTC m=+147.223510948" watchObservedRunningTime="2026-01-29 14:05:12.567915895 +0000 UTC m=+147.262650287" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.640671 4753 scope.go:117] "RemoveContainer" containerID="f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.641630 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.641599044 podStartE2EDuration="2.641599044s" podCreationTimestamp="2026-01-29 14:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:05:12.572122211 +0000 UTC m=+147.266856603" watchObservedRunningTime="2026-01-29 14:05:12.641599044 +0000 UTC m=+147.336333436" Jan 29 14:05:12 crc kubenswrapper[4753]: E0129 14:05:12.641806 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202\": container with ID starting with f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202 not found: ID does not exist" containerID="f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.641895 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202"} err="failed to get container status \"f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202\": rpc error: code = NotFound desc = could not find container \"f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202\": container with ID starting with f41577967130f8d56e26044b35186715f8dba8386d4285e4fa82f1473247e202 not found: ID does not exist" Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.653737 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4"] Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.657032 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b7ddd96d-mnql4"] Jan 29 14:05:12 crc kubenswrapper[4753]: I0129 14:05:12.679649 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d794955c-4298z"] Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.054598 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.055216 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.056594 4753 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.058788 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.066785 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.073128 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.156634 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.156746 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.159628 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.170548 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.183593 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.184241 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.268484 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.340917 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.345607 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.536021 4753 generic.go:334] "Generic (PLEG): container finished" podID="5a9fde53-86d6-45bd-8060-5be644476488" containerID="83972f3bb349306695fe86cdc96ea10c29cb87dca5560c5edb95d7fb358cabce" exitCode=0 Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.536101 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5a9fde53-86d6-45bd-8060-5be644476488","Type":"ContainerDied","Data":"83972f3bb349306695fe86cdc96ea10c29cb87dca5560c5edb95d7fb358cabce"} Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.542207 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" event={"ID":"8991923e-c6af-4748-93fa-2735e15a903b","Type":"ContainerStarted","Data":"91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136"} Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.542244 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.542254 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" event={"ID":"8991923e-c6af-4748-93fa-2735e15a903b","Type":"ContainerStarted","Data":"f978b0ea420e90be2f24e06e7d430ec07d51768435264d082c3e2cee966e7e0e"} Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.552850 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:13 crc kubenswrapper[4753]: I0129 14:05:13.573607 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" podStartSLOduration=10.573584558 podStartE2EDuration="10.573584558s" podCreationTimestamp="2026-01-29 14:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:05:13.571250203 +0000 UTC m=+148.265984585" watchObservedRunningTime="2026-01-29 14:05:13.573584558 +0000 UTC m=+148.268318940" Jan 29 14:05:14 crc kubenswrapper[4753]: I0129 14:05:14.157713 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69" path="/var/lib/kubelet/pods/4a7eab0e-aaeb-4d53-acd4-42c5ca8b2d69/volumes" Jan 29 14:05:14 crc kubenswrapper[4753]: I0129 14:05:14.551101 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"08064e87d56595487d4c6d7f66744da1f38a6fb5877fe3bb45dd442ed1a47f63"} Jan 29 14:05:14 crc kubenswrapper[4753]: I0129 14:05:14.551193 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9405f87200f04b79a43b125241317ab67010d46e1f745e291855d1bcef92b9f4"} Jan 29 14:05:14 crc kubenswrapper[4753]: I0129 14:05:14.552506 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:05:14 crc kubenswrapper[4753]: I0129 14:05:14.555264 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7d6cb65e630df70602659398b66a4ff05a323f8cd8299b830d3d04ff20b12d6e"} Jan 29 14:05:14 crc kubenswrapper[4753]: I0129 14:05:14.555303 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"728ed712b04f2698dc1ff8606c2a181a5ffce9224e68db43b74e81b9cb2c4108"} Jan 29 14:05:14 crc kubenswrapper[4753]: I0129 14:05:14.558680 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1dd7a6a8bbc23cae13fa958a8bd42ff4758e27f6ebdc203c2f01a15299a7b3b2"} Jan 29 14:05:14 crc kubenswrapper[4753]: I0129 14:05:14.558713 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2ebdaf1b46e542aedbbbc885b8493809a7930292d644ce0e6d36decabed86f9c"} Jan 29 14:05:14 crc kubenswrapper[4753]: I0129 14:05:14.956772 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:15 crc kubenswrapper[4753]: I0129 14:05:15.083780 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a9fde53-86d6-45bd-8060-5be644476488-kubelet-dir\") pod \"5a9fde53-86d6-45bd-8060-5be644476488\" (UID: \"5a9fde53-86d6-45bd-8060-5be644476488\") " Jan 29 14:05:15 crc kubenswrapper[4753]: I0129 14:05:15.084332 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a9fde53-86d6-45bd-8060-5be644476488-kube-api-access\") pod \"5a9fde53-86d6-45bd-8060-5be644476488\" (UID: \"5a9fde53-86d6-45bd-8060-5be644476488\") " Jan 29 14:05:15 crc kubenswrapper[4753]: I0129 14:05:15.084067 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a9fde53-86d6-45bd-8060-5be644476488-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5a9fde53-86d6-45bd-8060-5be644476488" (UID: "5a9fde53-86d6-45bd-8060-5be644476488"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:05:15 crc kubenswrapper[4753]: I0129 14:05:15.089958 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9fde53-86d6-45bd-8060-5be644476488-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5a9fde53-86d6-45bd-8060-5be644476488" (UID: "5a9fde53-86d6-45bd-8060-5be644476488"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:15 crc kubenswrapper[4753]: I0129 14:05:15.186695 4753 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a9fde53-86d6-45bd-8060-5be644476488-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:15 crc kubenswrapper[4753]: I0129 14:05:15.186743 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a9fde53-86d6-45bd-8060-5be644476488-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:15 crc kubenswrapper[4753]: I0129 14:05:15.569767 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5a9fde53-86d6-45bd-8060-5be644476488","Type":"ContainerDied","Data":"17f3acb446081cb1b0367a61982932bd0879c14b84a479e39cfd18c7789b4cab"} Jan 29 14:05:15 crc kubenswrapper[4753]: I0129 14:05:15.570508 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f3acb446081cb1b0367a61982932bd0879c14b84a479e39cfd18c7789b4cab" Jan 29 14:05:15 crc kubenswrapper[4753]: I0129 14:05:15.570035 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 29 14:05:16 crc kubenswrapper[4753]: I0129 14:05:16.579641 4753 generic.go:334] "Generic (PLEG): container finished" podID="482babf9-e074-489f-aae6-eb9c48639f25" containerID="68368678617d40b8ffa6c6cfddc8a362f700e20a8fecc05a7af8a27b30737377" exitCode=0 Jan 29 14:05:16 crc kubenswrapper[4753]: I0129 14:05:16.579693 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8rzt" event={"ID":"482babf9-e074-489f-aae6-eb9c48639f25","Type":"ContainerDied","Data":"68368678617d40b8ffa6c6cfddc8a362f700e20a8fecc05a7af8a27b30737377"} Jan 29 14:05:16 crc kubenswrapper[4753]: I0129 14:05:16.749174 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:05:16 crc kubenswrapper[4753]: I0129 14:05:16.749730 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:05:16 crc kubenswrapper[4753]: I0129 14:05:16.913840 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:05:17 crc kubenswrapper[4753]: I0129 14:05:17.589038 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8rzt" event={"ID":"482babf9-e074-489f-aae6-eb9c48639f25","Type":"ContainerStarted","Data":"34fcc8bc0397644d62234824d058eeb14be80513cf6424db0fcf005e5d43a6e8"} Jan 29 14:05:17 crc kubenswrapper[4753]: I0129 14:05:17.591374 4753 generic.go:334] "Generic (PLEG): container finished" podID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerID="31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d" exitCode=0 Jan 29 14:05:17 crc kubenswrapper[4753]: I0129 14:05:17.591488 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmmtj" event={"ID":"5875e879-79f3-4499-b460-b4b4a3e1637c","Type":"ContainerDied","Data":"31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d"} Jan 29 14:05:17 crc kubenswrapper[4753]: I0129 14:05:17.615945 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-f8rzt" podStartSLOduration=2.283970505 podStartE2EDuration="51.615920398s" podCreationTimestamp="2026-01-29 14:04:26 +0000 UTC" firstStartedPulling="2026-01-29 14:04:27.728111313 +0000 UTC m=+102.422845685" lastFinishedPulling="2026-01-29 14:05:17.060061166 +0000 UTC m=+151.754795578" observedRunningTime="2026-01-29 14:05:17.610095796 +0000 UTC m=+152.304830178" watchObservedRunningTime="2026-01-29 14:05:17.615920398 +0000 UTC m=+152.310654780" Jan 29 14:05:17 crc kubenswrapper[4753]: I0129 14:05:17.638789 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gz9wv" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.160502 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 29 14:05:18 crc kubenswrapper[4753]: E0129 14:05:18.161280 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9fde53-86d6-45bd-8060-5be644476488" containerName="pruner" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.161305 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9fde53-86d6-45bd-8060-5be644476488" containerName="pruner" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.161459 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9fde53-86d6-45bd-8060-5be644476488" containerName="pruner" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.161974 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.164011 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.164563 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.175324 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.337311 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.337353 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kube-api-access\") pod \"installer-9-crc\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.337586 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-var-lock\") pod \"installer-9-crc\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.439320 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.439379 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kube-api-access\") pod \"installer-9-crc\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.439437 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-var-lock\") pod \"installer-9-crc\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.439477 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.439640 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-var-lock\") pod \"installer-9-crc\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.469061 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kube-api-access\") pod \"installer-9-crc\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.476233 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.604111 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmmtj" event={"ID":"5875e879-79f3-4499-b460-b4b4a3e1637c","Type":"ContainerStarted","Data":"c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b"} Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.644335 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jmmtj" podStartSLOduration=2.404167021 podStartE2EDuration="50.644308844s" podCreationTimestamp="2026-01-29 14:04:28 +0000 UTC" firstStartedPulling="2026-01-29 14:04:29.801350522 +0000 UTC m=+104.496084904" lastFinishedPulling="2026-01-29 14:05:18.041492355 +0000 UTC m=+152.736226727" observedRunningTime="2026-01-29 14:05:18.633827001 +0000 UTC m=+153.328561393" watchObservedRunningTime="2026-01-29 14:05:18.644308844 +0000 UTC m=+153.339043226" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.722729 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.722784 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:05:18 crc kubenswrapper[4753]: I0129 14:05:18.890823 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 29 14:05:18 crc kubenswrapper[4753]: W0129 14:05:18.898958 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3b3422bb_6885_4aaf_97d6_0b4e613e81f1.slice/crio-06b6b971e4de35392b1cc341a27f8ce307d46504db467a20eae99e7798c50de6 WatchSource:0}: Error finding container 06b6b971e4de35392b1cc341a27f8ce307d46504db467a20eae99e7798c50de6: Status 404 returned error can't find the container with id 06b6b971e4de35392b1cc341a27f8ce307d46504db467a20eae99e7798c50de6 Jan 29 14:05:19 crc kubenswrapper[4753]: I0129 14:05:19.612265 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3b3422bb-6885-4aaf-97d6-0b4e613e81f1","Type":"ContainerStarted","Data":"35f44418dfb0e39faf6ac413da26904b9775ddcd5b7c2333bf7f1338b2fc58cf"} Jan 29 14:05:19 crc kubenswrapper[4753]: I0129 14:05:19.612747 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3b3422bb-6885-4aaf-97d6-0b4e613e81f1","Type":"ContainerStarted","Data":"06b6b971e4de35392b1cc341a27f8ce307d46504db467a20eae99e7798c50de6"} Jan 29 14:05:19 crc kubenswrapper[4753]: I0129 14:05:19.633399 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.6333833850000001 podStartE2EDuration="1.633383385s" podCreationTimestamp="2026-01-29 14:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:05:19.630231607 +0000 UTC m=+154.324965999" watchObservedRunningTime="2026-01-29 14:05:19.633383385 +0000 UTC m=+154.328117767" Jan 29 14:05:19 crc kubenswrapper[4753]: I0129 14:05:19.762674 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-jmmtj" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerName="registry-server" probeResult="failure" 
output=< Jan 29 14:05:19 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 14:05:19 crc kubenswrapper[4753]: > Jan 29 14:05:25 crc kubenswrapper[4753]: I0129 14:05:25.655854 4753 generic.go:334] "Generic (PLEG): container finished" podID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerID="8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74" exitCode=0 Jan 29 14:05:25 crc kubenswrapper[4753]: I0129 14:05:25.655910 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz65p" event={"ID":"6e49742f-8ecc-4140-b477-5a3448e130cc","Type":"ContainerDied","Data":"8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74"} Jan 29 14:05:25 crc kubenswrapper[4753]: I0129 14:05:25.659617 4753 generic.go:334] "Generic (PLEG): container finished" podID="dc59885a-9b00-47db-b9d8-e857c589abce" containerID="80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2" exitCode=0 Jan 29 14:05:25 crc kubenswrapper[4753]: I0129 14:05:25.659677 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7v9s" event={"ID":"dc59885a-9b00-47db-b9d8-e857c589abce","Type":"ContainerDied","Data":"80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2"} Jan 29 14:05:25 crc kubenswrapper[4753]: I0129 14:05:25.666579 4753 generic.go:334] "Generic (PLEG): container finished" podID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerID="c3db8665aab739287a49e3fc2b8a6b68d4fbdf39b48aa44713d4c755ea5d85b1" exitCode=0 Jan 29 14:05:25 crc kubenswrapper[4753]: I0129 14:05:25.666758 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj7d8" event={"ID":"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5","Type":"ContainerDied","Data":"c3db8665aab739287a49e3fc2b8a6b68d4fbdf39b48aa44713d4c755ea5d85b1"} Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.673286 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz65p" event={"ID":"6e49742f-8ecc-4140-b477-5a3448e130cc","Type":"ContainerStarted","Data":"793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f"} Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.676375 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6pp2" event={"ID":"4762024a-21e5-4b76-a778-f2a16551c198","Type":"ContainerStarted","Data":"45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9"} Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.679710 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sj8l" event={"ID":"48968e19-5dbd-4231-895f-c28e4178bf33","Type":"ContainerStarted","Data":"1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549"} Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.686335 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7v9s" event={"ID":"dc59885a-9b00-47db-b9d8-e857c589abce","Type":"ContainerStarted","Data":"f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9"} Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.716034 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gz65p" podStartSLOduration=3.229147415 podStartE2EDuration="1m0.716019904s" podCreationTimestamp="2026-01-29 14:04:26 +0000 UTC" firstStartedPulling="2026-01-29 14:04:28.761644631 +0000 UTC 
m=+103.456379023" lastFinishedPulling="2026-01-29 14:05:26.24851713 +0000 UTC m=+160.943251512" observedRunningTime="2026-01-29 14:05:26.714415669 +0000 UTC m=+161.409150051" watchObservedRunningTime="2026-01-29 14:05:26.716019904 +0000 UTC m=+161.410754286" Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.718709 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj7d8" event={"ID":"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5","Type":"ContainerStarted","Data":"ca7c762e90b6637412c0d6cdaddcf3b29b9cd833da6bb3cf5b70f8755ebe33fb"} Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.773132 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7v9s" podStartSLOduration=2.117405171 podStartE2EDuration="1m0.773114231s" podCreationTimestamp="2026-01-29 14:04:26 +0000 UTC" firstStartedPulling="2026-01-29 14:04:27.726774975 +0000 UTC m=+102.421509357" lastFinishedPulling="2026-01-29 14:05:26.382484035 +0000 UTC m=+161.077218417" observedRunningTime="2026-01-29 14:05:26.772601767 +0000 UTC m=+161.467336139" watchObservedRunningTime="2026-01-29 14:05:26.773114231 +0000 UTC m=+161.467848613" Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.913241 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.913295 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.913317 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.913328 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.956325 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:05:26 crc kubenswrapper[4753]: I0129 14:05:26.971592 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rj7d8" podStartSLOduration=4.904710263 podStartE2EDuration="58.971573901s" podCreationTimestamp="2026-01-29 14:04:28 +0000 UTC" firstStartedPulling="2026-01-29 14:04:32.035331871 +0000 UTC m=+106.730066243" lastFinishedPulling="2026-01-29 14:05:26.102195499 +0000 UTC m=+160.796929881" observedRunningTime="2026-01-29 14:05:26.829670085 +0000 UTC m=+161.524404467" watchObservedRunningTime="2026-01-29 14:05:26.971573901 +0000 UTC m=+161.666308283" Jan 29 14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.055640 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.055706 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 
14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.122316 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.122380 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.731859 4753 generic.go:334] "Generic (PLEG): container finished" podID="4762024a-21e5-4b76-a778-f2a16551c198" containerID="45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9" exitCode=0 Jan 29 14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.731957 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6pp2" event={"ID":"4762024a-21e5-4b76-a778-f2a16551c198","Type":"ContainerDied","Data":"45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9"} Jan 29 14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.739063 4753 generic.go:334] "Generic (PLEG): container finished" podID="48968e19-5dbd-4231-895f-c28e4178bf33" containerID="1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549" exitCode=0 Jan 29 14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.739476 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sj8l" event={"ID":"48968e19-5dbd-4231-895f-c28e4178bf33","Type":"ContainerDied","Data":"1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549"} Jan 29 14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.791622 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:05:27 crc kubenswrapper[4753]: I0129 14:05:27.946365 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-p7v9s" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" containerName="registry-server" probeResult="failure" output=< Jan 29 14:05:27 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 14:05:27 crc kubenswrapper[4753]: > Jan 29 14:05:28 crc kubenswrapper[4753]: I0129 14:05:28.167988 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-gz65p" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerName="registry-server" probeResult="failure" output=< Jan 29 14:05:28 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 14:05:28 crc kubenswrapper[4753]: > Jan 29 14:05:28 crc kubenswrapper[4753]: I0129 14:05:28.765590 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sj8l" event={"ID":"48968e19-5dbd-4231-895f-c28e4178bf33","Type":"ContainerStarted","Data":"8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2"} Jan 29 14:05:28 crc kubenswrapper[4753]: I0129 14:05:28.770269 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6pp2" event={"ID":"4762024a-21e5-4b76-a778-f2a16551c198","Type":"ContainerStarted","Data":"bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99"} Jan 29 14:05:28 crc kubenswrapper[4753]: I0129 14:05:28.780347 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:05:28 crc kubenswrapper[4753]: I0129 14:05:28.788769 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-2sj8l" podStartSLOduration=3.370646858 podStartE2EDuration="59.788753778s" podCreationTimestamp="2026-01-29 14:04:29 +0000 UTC" firstStartedPulling="2026-01-29 14:04:31.991034168 +0000 UTC m=+106.685768540" lastFinishedPulling="2026-01-29 14:05:28.409141078 +0000 UTC m=+163.103875460" observedRunningTime="2026-01-29 14:05:28.78667703 +0000 UTC m=+163.481411412" watchObservedRunningTime="2026-01-29 14:05:28.788753778 +0000 UTC m=+163.483488160" Jan 29 14:05:28 crc kubenswrapper[4753]: I0129 14:05:28.806872 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b6pp2" podStartSLOduration=3.514082308 podStartE2EDuration="59.806857841s" podCreationTimestamp="2026-01-29 14:04:29 +0000 UTC" firstStartedPulling="2026-01-29 14:04:32.062542008 +0000 UTC m=+106.757276390" lastFinishedPulling="2026-01-29 14:05:28.355317541 +0000 UTC m=+163.050051923" observedRunningTime="2026-01-29 14:05:28.805578946 +0000 UTC m=+163.500313328" watchObservedRunningTime="2026-01-29 14:05:28.806857841 +0000 UTC m=+163.501592223" Jan 29 14:05:28 crc kubenswrapper[4753]: I0129 14:05:28.835645 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jmmtj" Jan 29 14:05:28 crc kubenswrapper[4753]: I0129 14:05:28.991650 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f8rzt"] Jan 29 14:05:29 crc kubenswrapper[4753]: I0129 14:05:29.166750 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:05:29 crc kubenswrapper[4753]: I0129 14:05:29.166847 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:05:29 crc kubenswrapper[4753]: I0129 14:05:29.217800 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:05:29 crc kubenswrapper[4753]: I0129 14:05:29.775897 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f8rzt" podUID="482babf9-e074-489f-aae6-eb9c48639f25" containerName="registry-server" containerID="cri-o://34fcc8bc0397644d62234824d058eeb14be80513cf6424db0fcf005e5d43a6e8" gracePeriod=2 Jan 29 14:05:29 crc kubenswrapper[4753]: I0129 14:05:29.831491 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:05:29 crc kubenswrapper[4753]: I0129 14:05:29.831641 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:05:30 crc kubenswrapper[4753]: I0129 14:05:30.203900 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:05:30 crc kubenswrapper[4753]: I0129 14:05:30.203965 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:05:30 crc kubenswrapper[4753]: I0129 14:05:30.811332 4753 generic.go:334] "Generic (PLEG): container finished" podID="482babf9-e074-489f-aae6-eb9c48639f25" containerID="34fcc8bc0397644d62234824d058eeb14be80513cf6424db0fcf005e5d43a6e8" exitCode=0 Jan 29 14:05:30 crc kubenswrapper[4753]: I0129 14:05:30.812516 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-f8rzt" event={"ID":"482babf9-e074-489f-aae6-eb9c48639f25","Type":"ContainerDied","Data":"34fcc8bc0397644d62234824d058eeb14be80513cf6424db0fcf005e5d43a6e8"} Jan 29 14:05:30 crc kubenswrapper[4753]: I0129 14:05:30.879260 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2sj8l" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" containerName="registry-server" probeResult="failure" output=< Jan 29 14:05:30 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 14:05:30 crc kubenswrapper[4753]: > Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.076330 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.124764 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-utilities\") pod \"482babf9-e074-489f-aae6-eb9c48639f25\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.124844 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vdnp\" (UniqueName: \"kubernetes.io/projected/482babf9-e074-489f-aae6-eb9c48639f25-kube-api-access-4vdnp\") pod \"482babf9-e074-489f-aae6-eb9c48639f25\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.124904 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-catalog-content\") pod \"482babf9-e074-489f-aae6-eb9c48639f25\" (UID: \"482babf9-e074-489f-aae6-eb9c48639f25\") " Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.125753 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-utilities" (OuterVolumeSpecName: "utilities") pod "482babf9-e074-489f-aae6-eb9c48639f25" (UID: "482babf9-e074-489f-aae6-eb9c48639f25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.131341 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482babf9-e074-489f-aae6-eb9c48639f25-kube-api-access-4vdnp" (OuterVolumeSpecName: "kube-api-access-4vdnp") pod "482babf9-e074-489f-aae6-eb9c48639f25" (UID: "482babf9-e074-489f-aae6-eb9c48639f25"). InnerVolumeSpecName "kube-api-access-4vdnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.168437 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "482babf9-e074-489f-aae6-eb9c48639f25" (UID: "482babf9-e074-489f-aae6-eb9c48639f25"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.226954 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.226992 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vdnp\" (UniqueName: \"kubernetes.io/projected/482babf9-e074-489f-aae6-eb9c48639f25-kube-api-access-4vdnp\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.227002 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/482babf9-e074-489f-aae6-eb9c48639f25-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.253303 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b6pp2" podUID="4762024a-21e5-4b76-a778-f2a16551c198" containerName="registry-server" probeResult="failure" output=< Jan 29 14:05:31 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 14:05:31 crc kubenswrapper[4753]: > Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.850427 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8rzt" event={"ID":"482babf9-e074-489f-aae6-eb9c48639f25","Type":"ContainerDied","Data":"d167c36fafc6b23d600751c65377eb6c5a284c97914b9fbdc1c3885f97b6bf01"} Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.850496 4753 scope.go:117] "RemoveContainer" containerID="34fcc8bc0397644d62234824d058eeb14be80513cf6424db0fcf005e5d43a6e8" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.850524 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f8rzt" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.891716 4753 scope.go:117] "RemoveContainer" containerID="68368678617d40b8ffa6c6cfddc8a362f700e20a8fecc05a7af8a27b30737377" Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.900538 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f8rzt"] Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.905489 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f8rzt"] Jan 29 14:05:31 crc kubenswrapper[4753]: I0129 14:05:31.923250 4753 scope.go:117] "RemoveContainer" containerID="3e6bd543f9ac97d25415abbd607976b3bab94772a2a7f1fba5587be6558d40a6" Jan 29 14:05:32 crc kubenswrapper[4753]: I0129 14:05:32.156764 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482babf9-e074-489f-aae6-eb9c48639f25" path="/var/lib/kubelet/pods/482babf9-e074-489f-aae6-eb9c48639f25/volumes" Jan 29 14:05:36 crc kubenswrapper[4753]: I0129 14:05:36.957711 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:05:37 crc kubenswrapper[4753]: I0129 14:05:37.007215 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7v9s" Jan 29 14:05:37 crc kubenswrapper[4753]: I0129 14:05:37.159336 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:05:37 crc kubenswrapper[4753]: I0129 14:05:37.205716 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:05:38 crc kubenswrapper[4753]: I0129 14:05:38.601631 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gz65p"] Jan 29 14:05:38 crc kubenswrapper[4753]: I0129 14:05:38.897611 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gz65p" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerName="registry-server" containerID="cri-o://793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f" gracePeriod=2 Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.242542 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.451814 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.550445 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-catalog-content\") pod \"6e49742f-8ecc-4140-b477-5a3448e130cc\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.550529 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-utilities\") pod \"6e49742f-8ecc-4140-b477-5a3448e130cc\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.550738 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2hkp\" (UniqueName: \"kubernetes.io/projected/6e49742f-8ecc-4140-b477-5a3448e130cc-kube-api-access-r2hkp\") pod \"6e49742f-8ecc-4140-b477-5a3448e130cc\" (UID: \"6e49742f-8ecc-4140-b477-5a3448e130cc\") " Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.551590 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-utilities" (OuterVolumeSpecName: "utilities") pod "6e49742f-8ecc-4140-b477-5a3448e130cc" (UID: "6e49742f-8ecc-4140-b477-5a3448e130cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.562369 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e49742f-8ecc-4140-b477-5a3448e130cc-kube-api-access-r2hkp" (OuterVolumeSpecName: "kube-api-access-r2hkp") pod "6e49742f-8ecc-4140-b477-5a3448e130cc" (UID: "6e49742f-8ecc-4140-b477-5a3448e130cc"). InnerVolumeSpecName "kube-api-access-r2hkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.619596 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e49742f-8ecc-4140-b477-5a3448e130cc" (UID: "6e49742f-8ecc-4140-b477-5a3448e130cc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.652536 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2hkp\" (UniqueName: \"kubernetes.io/projected/6e49742f-8ecc-4140-b477-5a3448e130cc-kube-api-access-r2hkp\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.652578 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.652591 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e49742f-8ecc-4140-b477-5a3448e130cc-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.900599 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.911372 4753 generic.go:334] "Generic (PLEG): container finished" podID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerID="793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f" exitCode=0 Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.911481 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gz65p" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.911477 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz65p" event={"ID":"6e49742f-8ecc-4140-b477-5a3448e130cc","Type":"ContainerDied","Data":"793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f"} Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.911593 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz65p" event={"ID":"6e49742f-8ecc-4140-b477-5a3448e130cc","Type":"ContainerDied","Data":"7ea8788e793c6aee02dd5275d67004d091cb69d8ad60afba164b0e7a3627fe92"} Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.911657 4753 scope.go:117] "RemoveContainer" containerID="793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.938229 4753 scope.go:117] "RemoveContainer" containerID="8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.954959 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2sj8l" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.962381 4753 scope.go:117] "RemoveContainer" containerID="6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522" Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.965365 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gz65p"] Jan 29 14:05:39 crc kubenswrapper[4753]: I0129 14:05:39.969880 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gz65p"] Jan 29 14:05:40 crc kubenswrapper[4753]: I0129 14:05:40.012447 4753 scope.go:117] "RemoveContainer" containerID="793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f" Jan 29 14:05:40 crc kubenswrapper[4753]: E0129 14:05:40.013091 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f\": container with ID starting with 793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f not found: ID does not exist" containerID="793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f" Jan 29 14:05:40 crc kubenswrapper[4753]: I0129 14:05:40.013188 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f"} err="failed to get container status \"793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f\": rpc error: code = NotFound desc = could not find container \"793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f\": container with ID starting with 793a44cd24ec448b60c5eeab43c2131cc7f21918e24d86f05ea73f4bcaad416f not found: ID does not exist" Jan 29 14:05:40 crc kubenswrapper[4753]: I0129 14:05:40.013247 4753 scope.go:117] "RemoveContainer" containerID="8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74" Jan 29 14:05:40 crc kubenswrapper[4753]: E0129 14:05:40.013953 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74\": container with ID starting with 8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74 not found: ID does not exist" containerID="8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74" Jan 29 14:05:40 crc kubenswrapper[4753]: I0129 14:05:40.014006 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74"} err="failed to get container status \"8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74\": rpc error: code = NotFound desc = could not find container \"8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74\": container with ID starting with 8645b03851c20db339c0ddc17044578c1ce4e1a1e213c5460c9065dd06adfa74 not found: ID does not exist" Jan 29 14:05:40 crc kubenswrapper[4753]: I0129 14:05:40.014027 4753 scope.go:117] "RemoveContainer" containerID="6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522" Jan 29 14:05:40 crc kubenswrapper[4753]: E0129 14:05:40.014671 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522\": container with ID starting with 6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522 not found: ID does not exist" containerID="6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522" Jan 29 14:05:40 crc kubenswrapper[4753]: I0129 14:05:40.014704 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522"} err="failed to get container status \"6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522\": rpc error: code = NotFound desc = could not find container \"6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522\": container with ID starting with 6b6c9a6009bef57b77e80ca8734d71e506424d40ff96b70680b562dd0b516522 not found: ID does not exist" Jan 29 14:05:40 crc kubenswrapper[4753]: I0129 14:05:40.158310 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
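
The E-level "ContainerStatus from runtime service failed ... NotFound" / "DeleteContainer returned error" pairs above look alarming but are benign: the kubelet is re-checking containers it has already removed, and a NotFound answer from the runtime just confirms the removal. The usual way to consume such an API is to treat gRPC's NotFound code as success; a sketch of that pattern (the runtime interface here is a simplified stand-in, not the real CRI client):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// containerRemover is a simplified stand-in for a CRI-style runtime
// client; the real kubelet talks to CRI-O over gRPC.
type containerRemover interface {
	RemoveContainer(id string) error
}

// removeIfPresent treats NotFound as "already removed", which is why the
// NotFound errors in the log above are recorded and then ignored.
func removeIfPresent(rt containerRemover, id string) error {
	err := rt.RemoveContainer(id)
	if err == nil || status.Code(err) == codes.NotFound {
		return nil
	}
	return fmt.Errorf("remove container %s: %w", id, err)
}

// goneRuntime answers every removal the way CRI-O answered above.
type goneRuntime struct{}

func (goneRuntime) RemoveContainer(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q: ID does not exist", id)
}

func main() {
	if err := removeIfPresent(goneRuntime{}, "793a44cd24ec"); err != nil {
		fmt.Println("unexpected:", err)
		return
	}
	fmt.Println("NotFound treated as already removed")
}
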
podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" path="/var/lib/kubelet/pods/6e49742f-8ecc-4140-b477-5a3448e130cc/volumes" Jan 29 14:05:40 crc kubenswrapper[4753]: I0129 14:05:40.256579 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:05:40 crc kubenswrapper[4753]: I0129 14:05:40.314727 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:05:41 crc kubenswrapper[4753]: I0129 14:05:41.595756 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj7d8"] Jan 29 14:05:41 crc kubenswrapper[4753]: I0129 14:05:41.596468 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rj7d8" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerName="registry-server" containerID="cri-o://ca7c762e90b6637412c0d6cdaddcf3b29b9cd833da6bb3cf5b70f8755ebe33fb" gracePeriod=2 Jan 29 14:05:42 crc kubenswrapper[4753]: I0129 14:05:42.934422 4753 generic.go:334] "Generic (PLEG): container finished" podID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerID="ca7c762e90b6637412c0d6cdaddcf3b29b9cd833da6bb3cf5b70f8755ebe33fb" exitCode=0 Jan 29 14:05:42 crc kubenswrapper[4753]: I0129 14:05:42.934535 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj7d8" event={"ID":"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5","Type":"ContainerDied","Data":"ca7c762e90b6637412c0d6cdaddcf3b29b9cd833da6bb3cf5b70f8755ebe33fb"} Jan 29 14:05:42 crc kubenswrapper[4753]: I0129 14:05:42.934768 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj7d8" event={"ID":"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5","Type":"ContainerDied","Data":"43e43a7f20107290566cb56a2d04b835e91336b8e7a300e1a80e696bc6b24236"} Jan 29 14:05:42 crc kubenswrapper[4753]: I0129 14:05:42.934795 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43e43a7f20107290566cb56a2d04b835e91336b8e7a300e1a80e696bc6b24236" Jan 29 14:05:42 crc kubenswrapper[4753]: I0129 14:05:42.965133 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.121691 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm"] Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.121950 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" podUID="1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" containerName="controller-manager" containerID="cri-o://6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de" gracePeriod=30 Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.123502 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd6pv\" (UniqueName: \"kubernetes.io/projected/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-kube-api-access-wd6pv\") pod \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.123629 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-catalog-content\") pod \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.123686 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-utilities\") pod \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\" (UID: \"2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.124628 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-utilities" (OuterVolumeSpecName: "utilities") pod "2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" (UID: "2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.130341 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-kube-api-access-wd6pv" (OuterVolumeSpecName: "kube-api-access-wd6pv") pod "2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" (UID: "2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5"). InnerVolumeSpecName "kube-api-access-wd6pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.157386 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" (UID: "2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.211605 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d794955c-4298z"] Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.211829 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" podUID="8991923e-c6af-4748-93fa-2735e15a903b" containerName="route-controller-manager" containerID="cri-o://91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136" gracePeriod=30 Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.225999 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd6pv\" (UniqueName: \"kubernetes.io/projected/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-kube-api-access-wd6pv\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.226053 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.226073 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.275229 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.712623 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.718371 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.836883 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-config\") pod \"8991923e-c6af-4748-93fa-2735e15a903b\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.836934 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-proxy-ca-bundles\") pod \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.836958 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-client-ca\") pod \"8991923e-c6af-4748-93fa-2735e15a903b\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.836998 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz8w2\" (UniqueName: \"kubernetes.io/projected/8991923e-c6af-4748-93fa-2735e15a903b-kube-api-access-gz8w2\") pod \"8991923e-c6af-4748-93fa-2735e15a903b\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.837064 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgnfc\" (UniqueName: \"kubernetes.io/projected/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-kube-api-access-rgnfc\") pod \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.837084 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-config\") pod \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.837127 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8991923e-c6af-4748-93fa-2735e15a903b-serving-cert\") pod \"8991923e-c6af-4748-93fa-2735e15a903b\" (UID: \"8991923e-c6af-4748-93fa-2735e15a903b\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.837198 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-client-ca\") pod \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.837251 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-serving-cert\") pod \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\" (UID: \"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66\") " Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.838193 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-client-ca" (OuterVolumeSpecName: "client-ca") pod "1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" (UID: 
"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.838265 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-config" (OuterVolumeSpecName: "config") pod "1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" (UID: "1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.838518 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-client-ca" (OuterVolumeSpecName: "client-ca") pod "8991923e-c6af-4748-93fa-2735e15a903b" (UID: "8991923e-c6af-4748-93fa-2735e15a903b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.838567 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-config" (OuterVolumeSpecName: "config") pod "8991923e-c6af-4748-93fa-2735e15a903b" (UID: "8991923e-c6af-4748-93fa-2735e15a903b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.838687 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" (UID: "1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.844200 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-kube-api-access-rgnfc" (OuterVolumeSpecName: "kube-api-access-rgnfc") pod "1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" (UID: "1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66"). InnerVolumeSpecName "kube-api-access-rgnfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.844530 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" (UID: "1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.844579 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8991923e-c6af-4748-93fa-2735e15a903b-kube-api-access-gz8w2" (OuterVolumeSpecName: "kube-api-access-gz8w2") pod "8991923e-c6af-4748-93fa-2735e15a903b" (UID: "8991923e-c6af-4748-93fa-2735e15a903b"). InnerVolumeSpecName "kube-api-access-gz8w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.847233 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8991923e-c6af-4748-93fa-2735e15a903b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8991923e-c6af-4748-93fa-2735e15a903b" (UID: "8991923e-c6af-4748-93fa-2735e15a903b"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.938239 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8991923e-c6af-4748-93fa-2735e15a903b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.938276 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.938289 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.938302 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.938316 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.938334 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8991923e-c6af-4748-93fa-2735e15a903b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.938347 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz8w2\" (UniqueName: \"kubernetes.io/projected/8991923e-c6af-4748-93fa-2735e15a903b-kube-api-access-gz8w2\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.938359 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgnfc\" (UniqueName: \"kubernetes.io/projected/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-kube-api-access-rgnfc\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.938371 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.941950 4753 generic.go:334] "Generic (PLEG): container finished" podID="8991923e-c6af-4748-93fa-2735e15a903b" containerID="91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136" exitCode=0 Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.941992 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.942033 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" event={"ID":"8991923e-c6af-4748-93fa-2735e15a903b","Type":"ContainerDied","Data":"91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136"} Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.942072 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86d794955c-4298z" event={"ID":"8991923e-c6af-4748-93fa-2735e15a903b","Type":"ContainerDied","Data":"f978b0ea420e90be2f24e06e7d430ec07d51768435264d082c3e2cee966e7e0e"} Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.942140 4753 scope.go:117] "RemoveContainer" containerID="91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.944830 4753 generic.go:334] "Generic (PLEG): container finished" podID="1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" containerID="6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de" exitCode=0 Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.944899 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj7d8" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.944961 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.945343 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" event={"ID":"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66","Type":"ContainerDied","Data":"6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de"} Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.945383 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm" event={"ID":"1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66","Type":"ContainerDied","Data":"1084ebfb20d6b6e893c6301da846d90f1d25fac55fc61ef073a53627b05cf9a2"} Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.962957 4753 scope.go:117] "RemoveContainer" containerID="91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136" Jan 29 14:05:43 crc kubenswrapper[4753]: E0129 14:05:43.964650 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136\": container with ID starting with 91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136 not found: ID does not exist" containerID="91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.964689 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136"} err="failed to get container status \"91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136\": rpc error: code = NotFound desc = could not find container \"91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136\": container with ID starting with 91853c402561a2e2b4d6ee8c60dc761d4feaef8c1704e1bed911950342beb136 not found: ID 
does not exist" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.964716 4753 scope.go:117] "RemoveContainer" containerID="6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de" Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.976842 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d794955c-4298z"] Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.980763 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86d794955c-4298z"] Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.988808 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj7d8"] Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.993199 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj7d8"] Jan 29 14:05:43 crc kubenswrapper[4753]: I0129 14:05:43.998487 4753 scope.go:117] "RemoveContainer" containerID="6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.000122 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6pp2"] Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.000784 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b6pp2" podUID="4762024a-21e5-4b76-a778-f2a16551c198" containerName="registry-server" containerID="cri-o://bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99" gracePeriod=2 Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.001423 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de\": container with ID starting with 6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de not found: ID does not exist" containerID="6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.001509 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de"} err="failed to get container status \"6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de\": rpc error: code = NotFound desc = could not find container \"6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de\": container with ID starting with 6896224d42fd635ad00650ae7d068fd78b1f2670a2885a9c3fd715dd439f63de not found: ID does not exist" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.005217 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm"] Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.006909 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-dc4bfd7fd-86mgm"] Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.160659 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" path="/var/lib/kubelet/pods/1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66/volumes" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.161222 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" 
path="/var/lib/kubelet/pods/2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5/volumes" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.162222 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8991923e-c6af-4748-93fa-2735e15a903b" path="/var/lib/kubelet/pods/8991923e-c6af-4748-93fa-2735e15a903b/volumes" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.292664 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f88585fcb-dkltg"] Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300054 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8991923e-c6af-4748-93fa-2735e15a903b" containerName="route-controller-manager" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300227 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8991923e-c6af-4748-93fa-2735e15a903b" containerName="route-controller-manager" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300254 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerName="extract-utilities" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300264 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerName="extract-utilities" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300275 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482babf9-e074-489f-aae6-eb9c48639f25" containerName="extract-utilities" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300283 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="482babf9-e074-489f-aae6-eb9c48639f25" containerName="extract-utilities" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300298 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482babf9-e074-489f-aae6-eb9c48639f25" containerName="registry-server" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300309 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="482babf9-e074-489f-aae6-eb9c48639f25" containerName="registry-server" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300323 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerName="extract-content" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300330 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerName="extract-content" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300349 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" containerName="controller-manager" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300422 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" containerName="controller-manager" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300464 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerName="registry-server" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300471 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerName="registry-server" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300487 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482babf9-e074-489f-aae6-eb9c48639f25" containerName="extract-content" Jan 29 14:05:44 crc 
kubenswrapper[4753]: I0129 14:05:44.300563 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="482babf9-e074-489f-aae6-eb9c48639f25" containerName="extract-content" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300580 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerName="registry-server" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300587 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerName="registry-server" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300604 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerName="extract-content" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300611 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerName="extract-content" Jan 29 14:05:44 crc kubenswrapper[4753]: E0129 14:05:44.300624 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerName="extract-utilities" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.300632 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerName="extract-utilities" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.301192 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e71e0d4-bc4b-4f43-8742-bcfe62d1d7e5" containerName="registry-server" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.301231 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8991923e-c6af-4748-93fa-2735e15a903b" containerName="route-controller-manager" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.301251 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9f0d20-5bae-45d7-90c9-fb2e9f2e1c66" containerName="controller-manager" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.301265 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e49742f-8ecc-4140-b477-5a3448e130cc" containerName="registry-server" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.301287 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="482babf9-e074-489f-aae6-eb9c48639f25" containerName="registry-server" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.310709 4753 util.go:30] "No sandbox for pod can be found. 
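
The cpu_manager/state_mem and memory_manager entries above show resource-manager state being purged for pods that no longer exist, just before the replacement controller-manager pod is admitted. A toy version of that RemoveStaleState sweep over a (podUID, containerName)-keyed store (illustrative only, not the real state_mem implementation):

```go
// Purge per-container resource assignments whose pod is no longer active.
package main

import "fmt"

type key struct{ podUID, container string }

func removeStaleState(assignments map[key]string, activePods map[string]bool) {
	for k := range assignments {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %q\n", k.container, k.podUID)
			delete(assignments, k) // cf. "Deleted CPUSet assignment"
		}
	}
}

func main() {
	assignments := map[key]string{
		{"8991923e", "route-controller-manager"}: "0-3",
		{"df5bba10", "controller-manager"}:       "0-3",
	}
	removeStaleState(assignments, map[string]bool{"df5bba10": true})
	fmt.Println(assignments) // only the active pod's assignment survives
}
```
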
Need to start a new one" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.313329 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.313687 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.314106 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.314367 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.314692 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.314800 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.325928 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8"] Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.327539 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.335810 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.336567 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f88585fcb-dkltg"] Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.339123 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.339336 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.339135 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.340360 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.343313 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.344631 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8"] Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.344959 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.410478 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.454984 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-proxy-ca-bundles\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.455039 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-config\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.455061 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/129fa114-a24c-4ee9-8114-b8990390ecc2-serving-cert\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.455219 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-client-ca\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.455282 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-config\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.455307 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5bba10-4d0a-4096-b80f-797b0e0b77fc-serving-cert\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.455324 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fxkq\" (UniqueName: \"kubernetes.io/projected/129fa114-a24c-4ee9-8114-b8990390ecc2-kube-api-access-4fxkq\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.455340 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-client-ca\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " 
pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.455385 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkn9p\" (UniqueName: \"kubernetes.io/projected/df5bba10-4d0a-4096-b80f-797b0e0b77fc-kube-api-access-bkn9p\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.556558 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-catalog-content\") pod \"4762024a-21e5-4b76-a778-f2a16551c198\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.556953 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-utilities\") pod \"4762024a-21e5-4b76-a778-f2a16551c198\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.557084 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2brn2\" (UniqueName: \"kubernetes.io/projected/4762024a-21e5-4b76-a778-f2a16551c198-kube-api-access-2brn2\") pod \"4762024a-21e5-4b76-a778-f2a16551c198\" (UID: \"4762024a-21e5-4b76-a778-f2a16551c198\") " Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.557350 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-config\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.558925 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-utilities" (OuterVolumeSpecName: "utilities") pod "4762024a-21e5-4b76-a778-f2a16551c198" (UID: "4762024a-21e5-4b76-a778-f2a16551c198"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.558928 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-config\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.558978 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5bba10-4d0a-4096-b80f-797b0e0b77fc-serving-cert\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.559246 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fxkq\" (UniqueName: \"kubernetes.io/projected/129fa114-a24c-4ee9-8114-b8990390ecc2-kube-api-access-4fxkq\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.559358 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-client-ca\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.559459 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkn9p\" (UniqueName: \"kubernetes.io/projected/df5bba10-4d0a-4096-b80f-797b0e0b77fc-kube-api-access-bkn9p\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.559584 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-proxy-ca-bundles\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.559693 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-config\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.559790 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/129fa114-a24c-4ee9-8114-b8990390ecc2-serving-cert\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.559934 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-client-ca\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.560055 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.561233 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-config\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.562049 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-client-ca\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.562548 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-client-ca\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.562864 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-proxy-ca-bundles\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.568074 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4762024a-21e5-4b76-a778-f2a16551c198-kube-api-access-2brn2" (OuterVolumeSpecName: "kube-api-access-2brn2") pod "4762024a-21e5-4b76-a778-f2a16551c198" (UID: "4762024a-21e5-4b76-a778-f2a16551c198"). InnerVolumeSpecName "kube-api-access-2brn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.569510 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5bba10-4d0a-4096-b80f-797b0e0b77fc-serving-cert\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.570112 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/129fa114-a24c-4ee9-8114-b8990390ecc2-serving-cert\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.580588 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fxkq\" (UniqueName: \"kubernetes.io/projected/129fa114-a24c-4ee9-8114-b8990390ecc2-kube-api-access-4fxkq\") pod \"route-controller-manager-57c48b4dc9-zrtf8\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.590305 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkn9p\" (UniqueName: \"kubernetes.io/projected/df5bba10-4d0a-4096-b80f-797b0e0b77fc-kube-api-access-bkn9p\") pod \"controller-manager-f88585fcb-dkltg\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.654482 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.660284 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.663769 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2brn2\" (UniqueName: \"kubernetes.io/projected/4762024a-21e5-4b76-a778-f2a16551c198-kube-api-access-2brn2\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.735332 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4762024a-21e5-4b76-a778-f2a16551c198" (UID: "4762024a-21e5-4b76-a778-f2a16551c198"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.764989 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4762024a-21e5-4b76-a778-f2a16551c198-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.918269 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f88585fcb-dkltg"] Jan 29 14:05:44 crc kubenswrapper[4753]: W0129 14:05:44.920598 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf5bba10_4d0a_4096_b80f_797b0e0b77fc.slice/crio-0d831bd2939efaafa4a5e0d5f8f1697b5e0f442fca6cbc7b6f2cd3d59c9b3a6c WatchSource:0}: Error finding container 0d831bd2939efaafa4a5e0d5f8f1697b5e0f442fca6cbc7b6f2cd3d59c9b3a6c: Status 404 returned error can't find the container with id 0d831bd2939efaafa4a5e0d5f8f1697b5e0f442fca6cbc7b6f2cd3d59c9b3a6c Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.960294 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8"] Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.960340 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" event={"ID":"df5bba10-4d0a-4096-b80f-797b0e0b77fc","Type":"ContainerStarted","Data":"0d831bd2939efaafa4a5e0d5f8f1697b5e0f442fca6cbc7b6f2cd3d59c9b3a6c"} Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.963826 4753 generic.go:334] "Generic (PLEG): container finished" podID="4762024a-21e5-4b76-a778-f2a16551c198" containerID="bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99" exitCode=0 Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.963891 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6pp2" event={"ID":"4762024a-21e5-4b76-a778-f2a16551c198","Type":"ContainerDied","Data":"bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99"} Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.963905 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b6pp2" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.963974 4753 scope.go:117] "RemoveContainer" containerID="bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.963913 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6pp2" event={"ID":"4762024a-21e5-4b76-a778-f2a16551c198","Type":"ContainerDied","Data":"3b7c4052d6f2dd18311c8daa0a73df3610a1a88cad683ad029ae5c0b8c4e66e9"} Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.991236 4753 scope.go:117] "RemoveContainer" containerID="45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9" Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.993457 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6pp2"] Jan 29 14:05:44 crc kubenswrapper[4753]: I0129 14:05:44.996357 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b6pp2"] Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.023564 4753 scope.go:117] "RemoveContainer" containerID="14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b" Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.041510 4753 scope.go:117] "RemoveContainer" containerID="bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99" Jan 29 14:05:45 crc kubenswrapper[4753]: E0129 14:05:45.042290 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99\": container with ID starting with bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99 not found: ID does not exist" containerID="bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99" Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.042345 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99"} err="failed to get container status \"bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99\": rpc error: code = NotFound desc = could not find container \"bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99\": container with ID starting with bf1a99e0b1105d67c88c677020b671d351136f5a6bc04441613291678947ed99 not found: ID does not exist" Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.042383 4753 scope.go:117] "RemoveContainer" containerID="45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9" Jan 29 14:05:45 crc kubenswrapper[4753]: E0129 14:05:45.042701 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9\": container with ID starting with 45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9 not found: ID does not exist" containerID="45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9" Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.042728 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9"} err="failed to get container status \"45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9\": rpc error: code = NotFound desc = could not find container 
\"45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9\": container with ID starting with 45c32d268e9e72832373f06af622a2a15faa084d791eda90dca3dc64c6decbe9 not found: ID does not exist" Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.042751 4753 scope.go:117] "RemoveContainer" containerID="14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b" Jan 29 14:05:45 crc kubenswrapper[4753]: E0129 14:05:45.043006 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b\": container with ID starting with 14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b not found: ID does not exist" containerID="14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b" Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.043029 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b"} err="failed to get container status \"14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b\": rpc error: code = NotFound desc = could not find container \"14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b\": container with ID starting with 14d27914f6c16222ee94325b2d0a20250582137f1fe7362cc9f147fedd64a65b not found: ID does not exist" Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.718425 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gn6zd"] Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.974478 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" event={"ID":"129fa114-a24c-4ee9-8114-b8990390ecc2","Type":"ContainerStarted","Data":"a917063a80a9049cf2c3926e9a8ef9a7974f3e9e100f76e143968187e7c2b20f"} Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.974829 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" event={"ID":"129fa114-a24c-4ee9-8114-b8990390ecc2","Type":"ContainerStarted","Data":"a0ca251311a41939423046628070e4889e0f16aff380a907dd179e71061c7e8a"} Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.976320 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.978583 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" event={"ID":"df5bba10-4d0a-4096-b80f-797b0e0b77fc","Type":"ContainerStarted","Data":"c5c7b72b10965d94ee5234624a630ec76425dc4db504272e7ca12d00e6e07589"} Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.979303 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:45 crc kubenswrapper[4753]: I0129 14:05:45.985950 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:05:46 crc kubenswrapper[4753]: I0129 14:05:46.003053 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" podStartSLOduration=3.003036958 
podStartE2EDuration="3.003036958s" podCreationTimestamp="2026-01-29 14:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:05:45.999842169 +0000 UTC m=+180.694576551" watchObservedRunningTime="2026-01-29 14:05:46.003036958 +0000 UTC m=+180.697771340" Jan 29 14:05:46 crc kubenswrapper[4753]: I0129 14:05:46.006440 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:05:46 crc kubenswrapper[4753]: I0129 14:05:46.023028 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" podStartSLOduration=3.023015014 podStartE2EDuration="3.023015014s" podCreationTimestamp="2026-01-29 14:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:05:46.018288293 +0000 UTC m=+180.713022685" watchObservedRunningTime="2026-01-29 14:05:46.023015014 +0000 UTC m=+180.717749396" Jan 29 14:05:46 crc kubenswrapper[4753]: I0129 14:05:46.155099 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4762024a-21e5-4b76-a778-f2a16551c198" path="/var/lib/kubelet/pods/4762024a-21e5-4b76-a778-f2a16551c198/volumes" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.940081 4753 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.941367 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4762024a-21e5-4b76-a778-f2a16551c198" containerName="extract-content" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.941406 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4762024a-21e5-4b76-a778-f2a16551c198" containerName="extract-content" Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.941466 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4762024a-21e5-4b76-a778-f2a16551c198" containerName="registry-server" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.941475 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4762024a-21e5-4b76-a778-f2a16551c198" containerName="registry-server" Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.941498 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4762024a-21e5-4b76-a778-f2a16551c198" containerName="extract-utilities" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.941505 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4762024a-21e5-4b76-a778-f2a16551c198" containerName="extract-utilities" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.941951 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4762024a-21e5-4b76-a778-f2a16551c198" containerName="registry-server" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.943115 4753 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.943421 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.947990 4753 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.948370 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948388 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.948415 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948425 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.948445 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948455 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.948470 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948478 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.948497 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948506 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.948531 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948540 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 14:05:56 crc kubenswrapper[4753]: E0129 14:05:56.948558 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948568 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948872 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948887 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 
29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948905 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948923 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948945 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.948960 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.955417 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78" gracePeriod=15 Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.955750 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8" gracePeriod=15 Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.955853 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d" gracePeriod=15 Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.955939 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b" gracePeriod=15 Jan 29 14:05:56 crc kubenswrapper[4753]: I0129 14:05:56.956017 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b" gracePeriod=15 Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.012087 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.055257 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.055330 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" 
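
The five "Killing container with a grace period" entries (gracePeriod=15 here, and gracePeriod=2 for the registry-server earlier) correspond to the usual SIGTERM-then-SIGKILL sequence: signal the process, wait up to the grace period, and force-kill if it has not exited. A self-contained Unix sketch using a throwaway sleep process as a stand-in for a container's main process (not the CRI-O code path):

```go
// SIGTERM first, SIGKILL only after the grace period expires.
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func killWithGracePeriod(cmd *exec.Cmd, grace time.Duration) {
	_ = cmd.Process.Signal(syscall.SIGTERM) // polite request to stop
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = cmd.Process.Kill() // grace period expired: SIGKILL
		<-done
		fmt.Println("force-killed after grace period")
	}
}

func main() {
	cmd := exec.Command("sleep", "60") // stand-in workload; dies on SIGTERM
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	killWithGracePeriod(cmd, 2*time.Second) // cf. gracePeriod=2 and 15 above
}
```
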
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:05:57 crc kubenswrapper[4753]: E0129 14:05:57.056385 4753 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-x6rpz.188f38b4a2c085fb\": dial tcp 38.102.83.142:6443: connect: connection refused" event=< Jan 29 14:05:57 crc kubenswrapper[4753]: &Event{ObjectMeta:{machine-config-daemon-x6rpz.188f38b4a2c085fb openshift-machine-config-operator 29495 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-x6rpz,UID:49d14260-5f77-47b9-97e1-c843cf322a0f,APIVersion:v1,ResourceVersion:26696,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused Jan 29 14:05:57 crc kubenswrapper[4753]: body: Jan 29 14:05:57 crc kubenswrapper[4753]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-29 14:05:27 +0000 UTC,LastTimestamp:2026-01-29 14:05:57.055306104 +0000 UTC m=+191.750040496,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Jan 29 14:05:57 crc kubenswrapper[4753]: > Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.131794 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.132058 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.132178 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.132538 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.132642 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc 
kubenswrapper[4753]: I0129 14:05:57.132667 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.132822 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.132905 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.234626 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.234696 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.234771 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.234780 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.234830 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.234856 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.234967 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.234992 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.235074 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.235111 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.235273 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.235418 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.235426 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.235491 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.235530 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.235635 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.314963 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.655870 4753 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 29 14:05:57 crc kubenswrapper[4753]: I0129 14:05:57.656512 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.056963 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.058899 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.059702 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8" exitCode=0 Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.059781 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d" exitCode=0 Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.059823 4753 scope.go:117] "RemoveContainer" containerID="e961c2c1ca8d9509252be94e9b8f512d8d0ec1b312e88eddc501c2b600b2f792" Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.059852 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b" exitCode=0 Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.060029 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b" exitCode=2 Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.085694 4753 generic.go:334] "Generic (PLEG): container finished" podID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" containerID="35f44418dfb0e39faf6ac413da26904b9775ddcd5b7c2333bf7f1338b2fc58cf" exitCode=0 Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.085869 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3b3422bb-6885-4aaf-97d6-0b4e613e81f1","Type":"ContainerDied","Data":"35f44418dfb0e39faf6ac413da26904b9775ddcd5b7c2333bf7f1338b2fc58cf"} Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.086574 
4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.087029 4753 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.088003 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.088726 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be"} Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.088897 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2df83e24b4bde49b25a2463c2e8b636dca9f89caf6f3cdca26464eb9f1908b9b"} Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.089809 4753 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.090216 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:58 crc kubenswrapper[4753]: I0129 14:05:58.090578 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.129254 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.442225 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.443480 4753 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.443999 4753 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.444339 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.444713 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.553879 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.554896 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.555394 4753 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.555731 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.573857 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.573954 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.574001 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.574113 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.574110 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.574222 4753 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.574204 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.675775 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kube-api-access\") pod \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.675868 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kubelet-dir\") pod \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.675921 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-var-lock\") pod \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\" (UID: \"3b3422bb-6885-4aaf-97d6-0b4e613e81f1\") " Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.676025 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3b3422bb-6885-4aaf-97d6-0b4e613e81f1" (UID: "3b3422bb-6885-4aaf-97d6-0b4e613e81f1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.676311 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-var-lock" (OuterVolumeSpecName: "var-lock") pod "3b3422bb-6885-4aaf-97d6-0b4e613e81f1" (UID: "3b3422bb-6885-4aaf-97d6-0b4e613e81f1"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.676649 4753 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.676686 4753 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.676714 4753 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-var-lock\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.676736 4753 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.682368 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3b3422bb-6885-4aaf-97d6-0b4e613e81f1" (UID: "3b3422bb-6885-4aaf-97d6-0b4e613e81f1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:05:59 crc kubenswrapper[4753]: I0129 14:05:59.778471 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b3422bb-6885-4aaf-97d6-0b4e613e81f1-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.160037 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.165438 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.165651 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.166768 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3b3422bb-6885-4aaf-97d6-0b4e613e81f1","Type":"ContainerDied","Data":"06b6b971e4de35392b1cc341a27f8ce307d46504db467a20eae99e7798c50de6"} Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.166873 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b6b971e4de35392b1cc341a27f8ce307d46504db467a20eae99e7798c50de6" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.167011 4753 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78" exitCode=0 Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.167128 4753 scope.go:117] "RemoveContainer" containerID="71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.167196 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.168375 4753 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.168929 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.169198 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.201313 4753 scope.go:117] "RemoveContainer" containerID="dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.206246 4753 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.207112 4753 status_manager.go:851] 
"Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.207592 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.208328 4753 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.208695 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.209012 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.224743 4753 scope.go:117] "RemoveContainer" containerID="c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.245588 4753 scope.go:117] "RemoveContainer" containerID="28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.268313 4753 scope.go:117] "RemoveContainer" containerID="38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.292632 4753 scope.go:117] "RemoveContainer" containerID="0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.311473 4753 scope.go:117] "RemoveContainer" containerID="71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8" Jan 29 14:06:00 crc kubenswrapper[4753]: E0129 14:06:00.311998 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\": container with ID starting with 71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8 not found: ID does not exist" containerID="71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.312042 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8"} err="failed to get container status 
\"71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\": rpc error: code = NotFound desc = could not find container \"71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8\": container with ID starting with 71b0b4a7ab1fd697dfc62002efef04c22d73587d75f0ded733fb3731b1962cc8 not found: ID does not exist" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.312079 4753 scope.go:117] "RemoveContainer" containerID="dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d" Jan 29 14:06:00 crc kubenswrapper[4753]: E0129 14:06:00.312533 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\": container with ID starting with dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d not found: ID does not exist" containerID="dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.312601 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d"} err="failed to get container status \"dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\": rpc error: code = NotFound desc = could not find container \"dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d\": container with ID starting with dce2b17396198b209a8571b4a18be061e19e314d707df89e5424e5756f958c4d not found: ID does not exist" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.312646 4753 scope.go:117] "RemoveContainer" containerID="c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b" Jan 29 14:06:00 crc kubenswrapper[4753]: E0129 14:06:00.313018 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\": container with ID starting with c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b not found: ID does not exist" containerID="c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.313044 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b"} err="failed to get container status \"c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\": rpc error: code = NotFound desc = could not find container \"c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b\": container with ID starting with c8a4137c1dbdf276810cff7a0fe368d80d16f8c8867d71de57503605a9d97d1b not found: ID does not exist" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.313059 4753 scope.go:117] "RemoveContainer" containerID="28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b" Jan 29 14:06:00 crc kubenswrapper[4753]: E0129 14:06:00.313404 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\": container with ID starting with 28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b not found: ID does not exist" containerID="28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.313433 4753 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b"} err="failed to get container status \"28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\": rpc error: code = NotFound desc = could not find container \"28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b\": container with ID starting with 28a379e240e1a38a13991a898218d094e255a8fcb668f85f7237f3088d02249b not found: ID does not exist" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.313450 4753 scope.go:117] "RemoveContainer" containerID="38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78" Jan 29 14:06:00 crc kubenswrapper[4753]: E0129 14:06:00.313688 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\": container with ID starting with 38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78 not found: ID does not exist" containerID="38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.313743 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78"} err="failed to get container status \"38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\": rpc error: code = NotFound desc = could not find container \"38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78\": container with ID starting with 38dd99f2a78cf979f0a1c842da30a7eefa9c71c8e651bfbfa940a14514a20a78 not found: ID does not exist" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.313772 4753 scope.go:117] "RemoveContainer" containerID="0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70" Jan 29 14:06:00 crc kubenswrapper[4753]: E0129 14:06:00.314086 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\": container with ID starting with 0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70 not found: ID does not exist" containerID="0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70" Jan 29 14:06:00 crc kubenswrapper[4753]: I0129 14:06:00.314131 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70"} err="failed to get container status \"0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\": rpc error: code = NotFound desc = could not find container \"0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70\": container with ID starting with 0e96cb9e9144d661b5990e65d76cd15f9cc7b1d29b8db3e57f50850efb1ebe70 not found: ID does not exist" Jan 29 14:06:02 crc kubenswrapper[4753]: E0129 14:06:02.130010 4753 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-x6rpz.188f38b4a2c085fb\": dial tcp 38.102.83.142:6443: connect: connection refused" event=< Jan 29 14:06:02 crc kubenswrapper[4753]: &Event{ObjectMeta:{machine-config-daemon-x6rpz.188f38b4a2c085fb openshift-machine-config-operator 29495 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-x6rpz,UID:49d14260-5f77-47b9-97e1-c843cf322a0f,APIVersion:v1,ResourceVersion:26696,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused Jan 29 14:06:02 crc kubenswrapper[4753]: body: Jan 29 14:06:02 crc kubenswrapper[4753]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-29 14:05:27 +0000 UTC,LastTimestamp:2026-01-29 14:05:57.055306104 +0000 UTC m=+191.750040496,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Jan 29 14:06:02 crc kubenswrapper[4753]: > Jan 29 14:06:04 crc kubenswrapper[4753]: E0129 14:06:04.493768 4753 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:04 crc kubenswrapper[4753]: E0129 14:06:04.494965 4753 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:04 crc kubenswrapper[4753]: E0129 14:06:04.495970 4753 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:04 crc kubenswrapper[4753]: E0129 14:06:04.496673 4753 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:04 crc kubenswrapper[4753]: E0129 14:06:04.497200 4753 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:04 crc kubenswrapper[4753]: I0129 14:06:04.497306 4753 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 29 14:06:04 crc kubenswrapper[4753]: E0129 14:06:04.497802 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="200ms" Jan 29 14:06:04 crc kubenswrapper[4753]: E0129 14:06:04.699727 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="400ms" Jan 29 14:06:05 crc kubenswrapper[4753]: E0129 14:06:05.101031 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="800ms" Jan 29 14:06:05 crc kubenswrapper[4753]: E0129 14:06:05.902113 
4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="1.6s" Jan 29 14:06:06 crc kubenswrapper[4753]: I0129 14:06:06.161144 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:06 crc kubenswrapper[4753]: I0129 14:06:06.161988 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:07 crc kubenswrapper[4753]: E0129 14:06:07.502990 4753 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.142:6443: connect: connection refused" interval="3.2s" Jan 29 14:06:08 crc kubenswrapper[4753]: I0129 14:06:08.149572 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:06:08 crc kubenswrapper[4753]: I0129 14:06:08.150630 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:08 crc kubenswrapper[4753]: I0129 14:06:08.151193 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:08 crc kubenswrapper[4753]: I0129 14:06:08.184809 4753 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603" Jan 29 14:06:08 crc kubenswrapper[4753]: I0129 14:06:08.184857 4753 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603" Jan 29 14:06:08 crc kubenswrapper[4753]: E0129 14:06:08.185481 4753 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:06:08 crc kubenswrapper[4753]: I0129 14:06:08.187203 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.238423 4753 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2fc9ad66463449c488d6e7c10e82e687ee746c519a3be74aa9f9b03e7792a1b0" exitCode=0 Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.238595 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2fc9ad66463449c488d6e7c10e82e687ee746c519a3be74aa9f9b03e7792a1b0"} Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.239245 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"89307a3ab1d4864a20508a8898f25cad8e67edb55d6a5da041ea2672ec08688b"} Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.240123 4753 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603" Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.240225 4753 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603" Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.240829 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:09 crc kubenswrapper[4753]: E0129 14:06:09.241083 4753 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.241451 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.244131 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.244202 4753 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438" exitCode=1 Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.244239 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438"} Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.244915 4753 scope.go:117] "RemoveContainer" containerID="4c025e891825daa4d494775e85163f9f2deb2f382beee99ba9db0e7882e43438" Jan 29 14:06:09 crc 
kubenswrapper[4753]: I0129 14:06:09.245861 4753 status_manager.go:851] "Failed to get status for pod" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.246020 4753 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.246176 4753 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.142:6443: connect: connection refused" Jan 29 14:06:09 crc kubenswrapper[4753]: I0129 14:06:09.467213 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 29 14:06:10 crc kubenswrapper[4753]: I0129 14:06:10.272229 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 29 14:06:10 crc kubenswrapper[4753]: I0129 14:06:10.272841 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"35b0487881ffb68a3168c2302af8477b3d138ab9ba3467cc35d2f84f13936c1b"} Jan 29 14:06:10 crc kubenswrapper[4753]: I0129 14:06:10.280980 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"586a45fc0c7d052129776bef2e3412d78bb358052df7e497fc9e3bc8d93db782"} Jan 29 14:06:10 crc kubenswrapper[4753]: I0129 14:06:10.281031 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4a9b31ead7202788ee5a533ee23b3f961bea2d118df26578abe364db89b15af8"} Jan 29 14:06:10 crc kubenswrapper[4753]: I0129 14:06:10.281044 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b1bf8cfd4ba99be101994777cc25aaf4284fd68903a9c68e334349fc36c2e69d"} Jan 29 14:06:10 crc kubenswrapper[4753]: I0129 14:06:10.744473 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" podUID="4a83492d-36e3-4400-a969-71934ecfc9f7" containerName="oauth-openshift" containerID="cri-o://bab852b08fa726a717811425c76977d4b8a81a9e88e319395564a351fec56207" gracePeriod=15 Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.292931 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ce3508550fa27c643765c06c5e657d67ae361b9379af610ef5ed5eaef085b044"} Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.293287 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5833fc4a6dfffa77d97014ec717a0fc1f2e1bba40e9c78210f531b2db4ad4391"} Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.293311 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.293202 4753 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603" Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.293338 4753 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603" Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.295929 4753 generic.go:334] "Generic (PLEG): container finished" podID="4a83492d-36e3-4400-a969-71934ecfc9f7" containerID="bab852b08fa726a717811425c76977d4b8a81a9e88e319395564a351fec56207" exitCode=0 Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.296013 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" event={"ID":"4a83492d-36e3-4400-a969-71934ecfc9f7","Type":"ContainerDied","Data":"bab852b08fa726a717811425c76977d4b8a81a9e88e319395564a351fec56207"} Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.296042 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd" event={"ID":"4a83492d-36e3-4400-a969-71934ecfc9f7","Type":"ContainerDied","Data":"6b0d82125df57a7442de78d5a13fc3a210b3ef4dd79f2a9c1088e432cec5775b"} Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.296056 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b0d82125df57a7442de78d5a13fc3a210b3ef4dd79f2a9c1088e432cec5775b" Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.299366 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd"
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.475858 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-provider-selection\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.475925 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-service-ca\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.475968 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-dir\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.475994 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-serving-cert\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476021 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-session\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476055 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-trusted-ca-bundle\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476091 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-policies\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476111 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476125 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9rtg\" (UniqueName: \"kubernetes.io/projected/4a83492d-36e3-4400-a969-71934ecfc9f7-kube-api-access-z9rtg\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476687 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-router-certs\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476732 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-idp-0-file-data\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476751 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-error\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476790 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-login\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476811 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-ocp-branding-template\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.476835 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-cliconfig\") pod \"4a83492d-36e3-4400-a969-71934ecfc9f7\" (UID: \"4a83492d-36e3-4400-a969-71934ecfc9f7\") "
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.477287 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.477349 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.477495 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.477955 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.484501 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.484509 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a83492d-36e3-4400-a969-71934ecfc9f7-kube-api-access-z9rtg" (OuterVolumeSpecName: "kube-api-access-z9rtg") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "kube-api-access-z9rtg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.485045 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.485049 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.485519 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.485975 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.488827 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.496427 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.497051 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4a83492d-36e3-4400-a969-71934ecfc9f7" (UID: "4a83492d-36e3-4400-a969-71934ecfc9f7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581251 4753 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-dir\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581330 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581349 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581364 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581376 4753 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581388 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9rtg\" (UniqueName: \"kubernetes.io/projected/4a83492d-36e3-4400-a969-71934ecfc9f7-kube-api-access-z9rtg\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581402 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581416 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581432 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581444 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581455 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581466 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581479 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:11 crc kubenswrapper[4753]: I0129 14:06:11.581492 4753 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a83492d-36e3-4400-a969-71934ecfc9f7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:12 crc kubenswrapper[4753]: I0129 14:06:12.304014 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gn6zd"
Jan 29 14:06:13 crc kubenswrapper[4753]: I0129 14:06:13.187370 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 29 14:06:13 crc kubenswrapper[4753]: I0129 14:06:13.187860 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 29 14:06:13 crc kubenswrapper[4753]: I0129 14:06:13.197542 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 29 14:06:15 crc kubenswrapper[4753]: I0129 14:06:15.475490 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 29 14:06:15 crc kubenswrapper[4753]: I0129 14:06:15.479624 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 29 14:06:16 crc kubenswrapper[4753]: I0129 14:06:16.302694 4753 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 29 14:06:16 crc kubenswrapper[4753]: I0129 14:06:16.326660 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 29 14:06:16 crc kubenswrapper[4753]: I0129 14:06:16.326956 4753 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603"
Jan 29 14:06:16 crc kubenswrapper[4753]: I0129 14:06:16.326979 4753 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603"
Jan 29 14:06:16 crc kubenswrapper[4753]: I0129 14:06:16.332385 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 29 14:06:16 crc kubenswrapper[4753]: I0129 14:06:16.337779 4753 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bc1385a7-0303-44e2-b56e-e27494b04538"
Jan 29 14:06:17 crc kubenswrapper[4753]: I0129 14:06:17.334627 4753 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603"
Jan 29 14:06:17 crc kubenswrapper[4753]: I0129 14:06:17.334680 4753 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a5840803-8eed-4fb0-9307-9a39df0f7603"
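The run of entries above is the kubelet volume reconciler tearing down the deleted oauth-openshift pod: one "UnmountVolume started" and one "UnmountVolume.TearDown succeeded" per volume, then "Volume detached" once each mount point is gone. The plugin embedded in each UniqueName (kubernetes.io/secret, kubernetes.io/configmap, kubernetes.io/host-path, kubernetes.io/projected) names the volume source in the pod spec. As an illustrative sketch only, the matching volume stanzas would look roughly like the Go snippet below (k8s.io/api/core/v1): the volume names are copied from the log, while the referenced secret/configmap names and the hostPath path are assumptions, not read from the actual manifest.

package main

import (
	corev1 "k8s.io/api/core/v1"
)

// Illustrative only: volume sources of the shapes implied by the teardown
// entries above. Volume names come from the log; the secret/configmap names
// the sources point at and the hostPath path are assumed.
func oauthVolumes() []corev1.Volume {
	hostPathType := corev1.HostPathDirectoryOrCreate // assumed; the log only names plugin "kubernetes.io/host-path"
	return []corev1.Volume{
		// kubernetes.io/secret volume, as in the v4-0-config-system-session entry.
		{Name: "v4-0-config-system-session", VolumeSource: corev1.VolumeSource{
			Secret: &corev1.SecretVolumeSource{SecretName: "v4-0-config-system-session"}, // secret name assumed
		}},
		// kubernetes.io/configmap volume, as in the v4-0-config-system-service-ca entry.
		{Name: "v4-0-config-system-service-ca", VolumeSource: corev1.VolumeSource{
			ConfigMap: &corev1.ConfigMapVolumeSource{
				LocalObjectReference: corev1.LocalObjectReference{Name: "v4-0-config-system-service-ca"}, // configmap name assumed
			},
		}},
		// kubernetes.io/host-path volume, as in the audit-dir entry.
		{Name: "audit-dir", VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/var/log/oauth-server", Type: &hostPathType}, // path assumed
		}},
		// kube-api-access-z9rtg is a projected service-account token volume
		// (plugin "kubernetes.io/projected"); its projected sources are
		// injected by the API server and omitted here.
	}
}

func main() { _ = oauthVolumes() }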
Jan 29 14:06:24 crc kubenswrapper[4753]: I0129 14:06:24.007569 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 29 14:06:26 crc kubenswrapper[4753]: I0129 14:06:26.171633 4753 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bc1385a7-0303-44e2-b56e-e27494b04538"
Jan 29 14:06:26 crc kubenswrapper[4753]: I0129 14:06:26.245654 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 29 14:06:26 crc kubenswrapper[4753]: I0129 14:06:26.968912 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.031918 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.055660 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.055742 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.055808 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.056626 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.056712 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257" gracePeriod=600
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.301581 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.419302 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257" exitCode=0
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.419447 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257"}
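The probe failure above carries enough detail to reconstruct the probe's shape: an HTTP GET against http://127.0.0.1:8798/health answered with connection-refused, after which the kubelet kills the container (gracePeriod=600, presumably the pod's terminationGracePeriodSeconds) and restarts it, producing the ContainerDied/ContainerStarted pair that follows. A minimal sketch of a liveness probe of that shape; host, port and path come from the log output, the timing fields are assumptions:

package main

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

// Liveness probe of the shape implied by the failure above. Host, port and
// path are taken from the logged request ("Get http://127.0.0.1:8798/health");
// period and threshold are assumed, not read from the daemon's manifest.
func daemonLivenessProbe() *corev1.Probe {
	return &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Host: "127.0.0.1",
				Path: "/health",
				Port: intstr.FromInt(8798),
			},
		},
		PeriodSeconds:    10, // assumed
		FailureThreshold: 3,  // assumed; on breach the kubelet kills and restarts the container
	}
}

func main() { _ = daemonLivenessProbe() }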
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.434788 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.761243 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.772611 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.907565 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 29 14:06:27 crc kubenswrapper[4753]: I0129 14:06:27.957581 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 29 14:06:28 crc kubenswrapper[4753]: I0129 14:06:28.243819 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 29 14:06:28 crc kubenswrapper[4753]: I0129 14:06:28.340229 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 29 14:06:28 crc kubenswrapper[4753]: I0129 14:06:28.428305 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"75366cb6f8276e320c7a54930212ad89a510f8fd57854ceefd9052e46fbae159"}
Jan 29 14:06:28 crc kubenswrapper[4753]: I0129 14:06:28.687659 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 29 14:06:28 crc kubenswrapper[4753]: I0129 14:06:28.923705 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Jan 29 14:06:29 crc kubenswrapper[4753]: I0129 14:06:29.092722 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 29 14:06:29 crc kubenswrapper[4753]: I0129 14:06:29.198732 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 29 14:06:29 crc kubenswrapper[4753]: I0129 14:06:29.303038 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 29 14:06:29 crc kubenswrapper[4753]: I0129 14:06:29.362126 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 29 14:06:29 crc kubenswrapper[4753]: I0129 14:06:29.595249 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 29 14:06:29 crc kubenswrapper[4753]: I0129 14:06:29.693111 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 29 14:06:29 crc kubenswrapper[4753]: I0129 14:06:29.784234 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Jan 29 14:06:29 crc kubenswrapper[4753]: I0129 14:06:29.803693 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 29 14:06:29 crc kubenswrapper[4753]: I0129 14:06:29.939536 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.008571 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.034780 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.187380 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.228895 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.363209 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.482890 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.489817 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.553732 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.579702 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.599165 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.867021 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 29 14:06:30 crc kubenswrapper[4753]: I0129 14:06:30.939370 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.032105 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.155344 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.159642 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.191212 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.350601 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.364004 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.375945 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.418433 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.448857 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.475060 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.570935 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.734324 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.734459 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 29 14:06:31 crc kubenswrapper[4753]: I0129 14:06:31.826296 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.024777 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.079022 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.155820 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.176916 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.203555 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.236406 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.271480 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.311241 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.493692 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.534825 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.540568 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.554020 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.631128 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.777922 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.796807 4753 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.854407 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.864393 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.866548 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.946031 4753 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 29 14:06:32 crc kubenswrapper[4753]: I0129 14:06:32.995682 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.054487 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.192853 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.243224 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.259849 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.359348 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.544336 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.572105 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.592137 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.624655 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.627537 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.777426 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.867391 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.897760 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.906220 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 29 14:06:33 crc kubenswrapper[4753]: I0129 14:06:33.946244 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.068708 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.122879 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.234280 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.279538 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.300679 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.381817 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.407397 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.442075 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.614873 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.621366 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.636681 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.669003 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.783030 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.793186 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.825575 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.952928 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 29 14:06:34 crc kubenswrapper[4753]: I0129 14:06:34.965268 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.058975 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.066780 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.066956 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.179834 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.189384 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.203493 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.225505 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.312113 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.348995 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.755203 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.802131 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.830611 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.874588 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.927235 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.927610 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.954534 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 29 14:06:35 crc kubenswrapper[4753]: I0129 14:06:35.991290 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.138485 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.175737 4753 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.207968 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.441660 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.500457 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.617531 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.658089 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.671969 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.749107 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.961615 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.964752 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.966987 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 29 14:06:36 crc kubenswrapper[4753]: I0129 14:06:36.971567 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.044827 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.171468 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.172122 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.200802 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.226285 4753 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
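The long run of reflector.go:368 lines records client-go caches completing their initial List and Watch: the object-"namespace"/"name" entries come from the kubelet's per-object Secret and ConfigMap managers, while the *v1.Node, *v1.RuntimeClass and *v1.CSIDriver entries come from shared informers, as the k8s.io/client-go/informers/factory.go:160 source tag shows. Below is a minimal sketch of that informer pattern, assuming in-cluster credentials; it illustrates the library mechanism, not the kubelet's own wiring.

package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/cache"
)

func main() {
	// Assumes the process runs in a pod with a service account; any
	// rest.Config would work the same way.
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// A shared informer factory drives one reflector per requested type.
	factory := informers.NewSharedInformerFactory(client, 30*time.Minute)
	cmInformer := factory.Core().V1().ConfigMaps().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop) // each reflector performs an initial List, then a Watch

	// Returns once the initial List has filled the local store; this is the
	// point at which the kubelet logs "Caches populated for *v1.ConfigMap".
	if !cache.WaitForCacheSync(stop, cmInformer.HasSynced) {
		panic("cache never synced")
	}
	fmt.Println("ConfigMap cache populated")
}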
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.251047 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.288375 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.298928 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.348993 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.350534 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.478097 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.629796 4753 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.634831 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.634809092 podStartE2EDuration="41.634809092s" podCreationTimestamp="2026-01-29 14:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:06:16.205906082 +0000 UTC m=+210.900640464" watchObservedRunningTime="2026-01-29 14:06:37.634809092 +0000 UTC m=+232.329543474"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.635215 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gn6zd","openshift-kube-apiserver/kube-apiserver-crc"]
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.635280 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.640734 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.655964 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.65594327 podStartE2EDuration="21.65594327s" podCreationTimestamp="2026-01-29 14:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:06:37.653234965 +0000 UTC m=+232.347969377" watchObservedRunningTime="2026-01-29 14:06:37.65594327 +0000 UTC m=+232.350677662"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.728724 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.759295 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.764354 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
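In the pod_startup_latency_tracker entries above, the m=+232.3... suffix is the kubelet's monotonic clock reading since process start, and because no image pull happened (firstStartedPulling and lastFinishedPulling are the zero time) podStartSLOduration is simply watchObservedRunningTime minus podCreationTimestamp. The arithmetic checks out; a small sketch with the timestamps copied verbatim from the log:

package main

import (
	"fmt"
	"time"
)

// Reproduces the podStartSLOduration arithmetic from the startup-monitor
// entry above: 14:06:37.634809092 minus 14:05:56 is 41.634809092s.
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // the layout time.Time.String() emits
	created, err := time.Parse(layout, "2026-01-29 14:05:56 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2026-01-29 14:06:37.634809092 +0000 UTC")
	if err != nil {
		panic(err)
	}
	fmt.Println(observed.Sub(created)) // prints 41.634809092s, matching podStartSLOduration
}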
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.809437 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.823283 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.890443 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.900761 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.958187 4753 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.982642 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 29 14:06:37 crc kubenswrapper[4753]: I0129 14:06:37.983819 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.022840 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.122482 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.156750 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a83492d-36e3-4400-a969-71934ecfc9f7" path="/var/lib/kubelet/pods/4a83492d-36e3-4400-a969-71934ecfc9f7/volumes"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.167858 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.205981 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.229134 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.239937 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.253736 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.263400 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.326258 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.415535 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.438813 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.474906 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.504299 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.507550 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.620022 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.856239 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.874341 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.934622 4753 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.934859 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be" gracePeriod=5
Jan 29 14:06:38 crc kubenswrapper[4753]: I0129 14:06:38.970421 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.020934 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.179305 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.210256 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.258383 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.427426 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.513440 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.568223 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.613519 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.614768 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
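gracePeriod in the "Killing container with a grace period" entries tracks where the kill deadline came from: 5 seconds here for the startup-monitor static pod removed from the manifest directory (source="file"), 600 seconds earlier for the liveness-probe restart, and 30 seconds for the API-initiated deletes further down. A sketch of one way that value gets set, an API-side delete with an explicit DeleteOptions grace period (absent one, the pod spec's terminationGracePeriodSeconds applies); it assumes in-cluster credentials, and the pod name and namespace are copied from the controller-manager delete below:

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	// Assumes in-cluster credentials with permission to delete the pod.
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// An explicit grace period on the delete request; the kubelet then
	// surfaces it as gracePeriod=30 when it kills the container.
	grace := int64(30)
	err = client.CoreV1().Pods("openshift-controller-manager").Delete(
		context.TODO(),
		"controller-manager-f88585fcb-dkltg",
		metav1.DeleteOptions{GracePeriodSeconds: &grace},
	)
	if err != nil {
		panic(err)
	}
}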
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.623938 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.632815 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.662286 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.695593 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.836830 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 29 14:06:39 crc kubenswrapper[4753]: I0129 14:06:39.868032 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.008949 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.020705 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.049141 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.092543 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.143928 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.189230 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.194206 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.292344 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.326779 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.347082 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.523001 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 29 14:06:40 crc kubenswrapper[4753]: I0129 14:06:40.660304 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.056918 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.193479 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.252465 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.293089 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.307590 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.346505 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.355650 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.496581 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.505068 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.533030 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.668474 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.716773 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.741388 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.773742 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 29 14:06:41 crc kubenswrapper[4753]: I0129 14:06:41.804822 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.035035 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.080604 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.165356 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.176499 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.280974 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.318875 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.375511 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.376049 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.756788 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 29 14:06:42 crc kubenswrapper[4753]: I0129 14:06:42.794765 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.005410 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.031055 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.172649 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f88585fcb-dkltg"]
Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.172961 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" podUID="df5bba10-4d0a-4096-b80f-797b0e0b77fc" containerName="controller-manager" containerID="cri-o://c5c7b72b10965d94ee5234624a630ec76425dc4db504272e7ca12d00e6e07589" gracePeriod=30
Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.184761 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8"]
Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.185096 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" podUID="129fa114-a24c-4ee9-8114-b8990390ecc2" containerName="route-controller-manager" containerID="cri-o://a917063a80a9049cf2c3926e9a8ef9a7974f3e9e100f76e143968187e7c2b20f" gracePeriod=30
Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.210396 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.334545 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-849dbf65f-s795x"]
Jan 29 14:06:43 crc kubenswrapper[4753]: E0129 14:06:43.335236 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" containerName="installer"
Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.335253 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" containerName="installer"
Jan 29 14:06:43 crc kubenswrapper[4753]: E0129 14:06:43.335281 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 29 14:06:43 crc kubenswrapper[4753]:
I0129 14:06:43.335290 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 29 14:06:43 crc kubenswrapper[4753]: E0129 14:06:43.335304 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a83492d-36e3-4400-a969-71934ecfc9f7" containerName="oauth-openshift" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.335311 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a83492d-36e3-4400-a969-71934ecfc9f7" containerName="oauth-openshift" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.335427 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b3422bb-6885-4aaf-97d6-0b4e613e81f1" containerName="installer" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.335447 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.335460 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a83492d-36e3-4400-a969-71934ecfc9f7" containerName="oauth-openshift" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.335979 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.338414 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.339449 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.342627 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.343746 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.343867 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.346003 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.346341 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.346710 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.346839 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.350211 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.351569 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-849dbf65f-s795x"] Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.351865 4753 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"audit" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.352070 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.352845 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.357352 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.357873 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.379527 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.465247 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489187 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-service-ca\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489230 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489582 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489620 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489656 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-audit-policies\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc 
kubenswrapper[4753]: I0129 14:06:43.489680 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-router-certs\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489705 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489729 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whmrp\" (UniqueName: \"kubernetes.io/projected/ffeafba7-00ef-4d05-96a0-6dceeece860d-kube-api-access-whmrp\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489760 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-template-error\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489786 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489810 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-template-login\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489830 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-session\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489875 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.489902 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffeafba7-00ef-4d05-96a0-6dceeece860d-audit-dir\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.496634 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.545562 4753 generic.go:334] "Generic (PLEG): container finished" podID="df5bba10-4d0a-4096-b80f-797b0e0b77fc" containerID="c5c7b72b10965d94ee5234624a630ec76425dc4db504272e7ca12d00e6e07589" exitCode=0 Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.545644 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" event={"ID":"df5bba10-4d0a-4096-b80f-797b0e0b77fc","Type":"ContainerDied","Data":"c5c7b72b10965d94ee5234624a630ec76425dc4db504272e7ca12d00e6e07589"} Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.547022 4753 generic.go:334] "Generic (PLEG): container finished" podID="129fa114-a24c-4ee9-8114-b8990390ecc2" containerID="a917063a80a9049cf2c3926e9a8ef9a7974f3e9e100f76e143968187e7c2b20f" exitCode=0 Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.547062 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" event={"ID":"129fa114-a24c-4ee9-8114-b8990390ecc2","Type":"ContainerDied","Data":"a917063a80a9049cf2c3926e9a8ef9a7974f3e9e100f76e143968187e7c2b20f"} Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.590975 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-session\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.591090 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.591137 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffeafba7-00ef-4d05-96a0-6dceeece860d-audit-dir\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593213 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-service-ca\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593309 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593356 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593402 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-audit-policies\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593434 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-router-certs\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593471 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593508 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whmrp\" (UniqueName: \"kubernetes.io/projected/ffeafba7-00ef-4d05-96a0-6dceeece860d-kube-api-access-whmrp\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593553 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-template-error\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593589 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593624 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-template-login\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.593108 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.591345 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ffeafba7-00ef-4d05-96a0-6dceeece860d-audit-dir\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.594909 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-audit-policies\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.595567 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-service-ca\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.595800 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.598380 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-template-error\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: 
\"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.598731 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-session\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.598808 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.599215 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.600549 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.600879 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.607132 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-system-router-certs\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.608293 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ffeafba7-00ef-4d05-96a0-6dceeece860d-v4-0-config-user-template-login\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.616335 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whmrp\" (UniqueName: \"kubernetes.io/projected/ffeafba7-00ef-4d05-96a0-6dceeece860d-kube-api-access-whmrp\") pod \"oauth-openshift-849dbf65f-s795x\" (UID: \"ffeafba7-00ef-4d05-96a0-6dceeece860d\") " 
pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.629059 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.665678 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.684422 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.694079 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkn9p\" (UniqueName: \"kubernetes.io/projected/df5bba10-4d0a-4096-b80f-797b0e0b77fc-kube-api-access-bkn9p\") pod \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.694208 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-proxy-ca-bundles\") pod \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.694234 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-client-ca\") pod \"129fa114-a24c-4ee9-8114-b8990390ecc2\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.694285 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-config\") pod \"129fa114-a24c-4ee9-8114-b8990390ecc2\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.694315 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/129fa114-a24c-4ee9-8114-b8990390ecc2-serving-cert\") pod \"129fa114-a24c-4ee9-8114-b8990390ecc2\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.694373 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5bba10-4d0a-4096-b80f-797b0e0b77fc-serving-cert\") pod \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.694406 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-client-ca\") pod \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.694454 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-config\") pod \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\" (UID: \"df5bba10-4d0a-4096-b80f-797b0e0b77fc\") " Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.694477 4753 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fxkq\" (UniqueName: \"kubernetes.io/projected/129fa114-a24c-4ee9-8114-b8990390ecc2-kube-api-access-4fxkq\") pod \"129fa114-a24c-4ee9-8114-b8990390ecc2\" (UID: \"129fa114-a24c-4ee9-8114-b8990390ecc2\") " Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.696247 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-config" (OuterVolumeSpecName: "config") pod "129fa114-a24c-4ee9-8114-b8990390ecc2" (UID: "129fa114-a24c-4ee9-8114-b8990390ecc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.698173 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129fa114-a24c-4ee9-8114-b8990390ecc2-kube-api-access-4fxkq" (OuterVolumeSpecName: "kube-api-access-4fxkq") pod "129fa114-a24c-4ee9-8114-b8990390ecc2" (UID: "129fa114-a24c-4ee9-8114-b8990390ecc2"). InnerVolumeSpecName "kube-api-access-4fxkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.699586 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5bba10-4d0a-4096-b80f-797b0e0b77fc-kube-api-access-bkn9p" (OuterVolumeSpecName: "kube-api-access-bkn9p") pod "df5bba10-4d0a-4096-b80f-797b0e0b77fc" (UID: "df5bba10-4d0a-4096-b80f-797b0e0b77fc"). InnerVolumeSpecName "kube-api-access-bkn9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.699976 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "df5bba10-4d0a-4096-b80f-797b0e0b77fc" (UID: "df5bba10-4d0a-4096-b80f-797b0e0b77fc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.700321 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-client-ca" (OuterVolumeSpecName: "client-ca") pod "129fa114-a24c-4ee9-8114-b8990390ecc2" (UID: "129fa114-a24c-4ee9-8114-b8990390ecc2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.701996 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "df5bba10-4d0a-4096-b80f-797b0e0b77fc" (UID: "df5bba10-4d0a-4096-b80f-797b0e0b77fc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.702924 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129fa114-a24c-4ee9-8114-b8990390ecc2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "129fa114-a24c-4ee9-8114-b8990390ecc2" (UID: "129fa114-a24c-4ee9-8114-b8990390ecc2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.703855 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-config" (OuterVolumeSpecName: "config") pod "df5bba10-4d0a-4096-b80f-797b0e0b77fc" (UID: "df5bba10-4d0a-4096-b80f-797b0e0b77fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.704402 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5bba10-4d0a-4096-b80f-797b0e0b77fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "df5bba10-4d0a-4096-b80f-797b0e0b77fc" (UID: "df5bba10-4d0a-4096-b80f-797b0e0b77fc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.734142 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.795562 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df5bba10-4d0a-4096-b80f-797b0e0b77fc-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.795598 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.795613 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.795626 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fxkq\" (UniqueName: \"kubernetes.io/projected/129fa114-a24c-4ee9-8114-b8990390ecc2-kube-api-access-4fxkq\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.795641 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkn9p\" (UniqueName: \"kubernetes.io/projected/df5bba10-4d0a-4096-b80f-797b0e0b77fc-kube-api-access-bkn9p\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.795652 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.795664 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df5bba10-4d0a-4096-b80f-797b0e0b77fc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.795676 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/129fa114-a24c-4ee9-8114-b8990390ecc2-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.795687 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/129fa114-a24c-4ee9-8114-b8990390ecc2-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.870310 4753 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 29 14:06:43 crc kubenswrapper[4753]: I0129 14:06:43.896447 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.063982 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.176134 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-849dbf65f-s795x"] Jan 29 14:06:44 crc kubenswrapper[4753]: W0129 14:06:44.185644 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffeafba7_00ef_4d05_96a0_6dceeece860d.slice/crio-0b0647797b2b9bf9fafe1c6fd5bde7c1eefcc59c8cdb2705f6908eee85324b23 WatchSource:0}: Error finding container 0b0647797b2b9bf9fafe1c6fd5bde7c1eefcc59c8cdb2705f6908eee85324b23: Status 404 returned error can't find the container with id 0b0647797b2b9bf9fafe1c6fd5bde7c1eefcc59c8cdb2705f6908eee85324b23 Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.336377 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"] Jan 29 14:06:44 crc kubenswrapper[4753]: E0129 14:06:44.336850 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5bba10-4d0a-4096-b80f-797b0e0b77fc" containerName="controller-manager" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.336884 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5bba10-4d0a-4096-b80f-797b0e0b77fc" containerName="controller-manager" Jan 29 14:06:44 crc kubenswrapper[4753]: E0129 14:06:44.336916 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129fa114-a24c-4ee9-8114-b8990390ecc2" containerName="route-controller-manager" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.336929 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="129fa114-a24c-4ee9-8114-b8990390ecc2" containerName="route-controller-manager" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.337134 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="129fa114-a24c-4ee9-8114-b8990390ecc2" containerName="route-controller-manager" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.337195 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5bba10-4d0a-4096-b80f-797b0e0b77fc" containerName="controller-manager" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.338806 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.343179 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"] Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.503616 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-client-ca\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.503785 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f36ebdc1-69ee-40c7-a775-fe8382995712-serving-cert\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.503904 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-config\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.503942 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7tgk\" (UniqueName: \"kubernetes.io/projected/f36ebdc1-69ee-40c7-a775-fe8382995712-kube-api-access-v7tgk\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.506001 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.506074 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.549446 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.554917 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" event={"ID":"ffeafba7-00ef-4d05-96a0-6dceeece860d","Type":"ContainerStarted","Data":"148ebab43787a459f9f8b24c508bbf0c8664da87f9b60420157106c8c137acc8"} Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.554974 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" event={"ID":"ffeafba7-00ef-4d05-96a0-6dceeece860d","Type":"ContainerStarted","Data":"0b0647797b2b9bf9fafe1c6fd5bde7c1eefcc59c8cdb2705f6908eee85324b23"} Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.557364 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.557565 4753 patch_prober.go:28] interesting pod/oauth-openshift-849dbf65f-s795x container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": dial tcp 10.217.0.62:6443: connect: connection refused" start-of-body= Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.557620 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" podUID="ffeafba7-00ef-4d05-96a0-6dceeece860d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": dial tcp 10.217.0.62:6443: connect: connection refused" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.558919 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.558981 4753 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be" exitCode=137 Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.559043 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.559090 4753 scope.go:117] "RemoveContainer" containerID="f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.561758 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" event={"ID":"df5bba10-4d0a-4096-b80f-797b0e0b77fc","Type":"ContainerDied","Data":"0d831bd2939efaafa4a5e0d5f8f1697b5e0f442fca6cbc7b6f2cd3d59c9b3a6c"} Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.562036 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f88585fcb-dkltg" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.563425 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" event={"ID":"129fa114-a24c-4ee9-8114-b8990390ecc2","Type":"ContainerDied","Data":"a0ca251311a41939423046628070e4889e0f16aff380a907dd179e71061c7e8a"} Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.563497 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.577113 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-849dbf65f-s795x" podStartSLOduration=59.577100116 podStartE2EDuration="59.577100116s" podCreationTimestamp="2026-01-29 14:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:06:44.57436994 +0000 UTC m=+239.269104422" watchObservedRunningTime="2026-01-29 14:06:44.577100116 +0000 UTC m=+239.271834498" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.584458 4753 scope.go:117] "RemoveContainer" containerID="f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be" Jan 29 14:06:44 crc kubenswrapper[4753]: E0129 14:06:44.584984 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be\": container with ID starting with f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be not found: ID does not exist" containerID="f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.585046 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be"} err="failed to get container status \"f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be\": rpc error: code = NotFound desc = could not find container \"f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be\": container with ID starting with f670a729bea63fe237248a8240e2dc59f4c3498778e3a84d71274820dc3462be not found: ID does not exist" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.585081 4753 scope.go:117] "RemoveContainer" containerID="c5c7b72b10965d94ee5234624a630ec76425dc4db504272e7ca12d00e6e07589" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.604642 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f88585fcb-dkltg"] Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.605688 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-client-ca\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.606355 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f88585fcb-dkltg"] Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.607590 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-client-ca\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.607761 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f36ebdc1-69ee-40c7-a775-fe8382995712-serving-cert\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.609608 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-config\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.609883 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7tgk\" (UniqueName: \"kubernetes.io/projected/f36ebdc1-69ee-40c7-a775-fe8382995712-kube-api-access-v7tgk\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.611744 4753 scope.go:117] "RemoveContainer" containerID="a917063a80a9049cf2c3926e9a8ef9a7974f3e9e100f76e143968187e7c2b20f"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.611846 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-config\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.621711 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f36ebdc1-69ee-40c7-a775-fe8382995712-serving-cert\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.625267 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8"]
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.627088 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c48b4dc9-zrtf8"]
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.629815 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.630843 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7tgk\" (UniqueName: \"kubernetes.io/projected/f36ebdc1-69ee-40c7-a775-fe8382995712-kube-api-access-v7tgk\") pod \"route-controller-manager-7686c96dcf-f6md5\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.665634 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711236 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711297 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711316 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711338 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711317 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711357 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711382 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711440 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711577 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711676 4753 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711693 4753 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711706 4753 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.711717 4753 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.718966 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:06:44 crc kubenswrapper[4753]: I0129 14:06:44.813306 4753 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.003752 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.073966 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"]
Jan 29 14:06:45 crc kubenswrapper[4753]: W0129 14:06:45.086634 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf36ebdc1_69ee_40c7_a775_fe8382995712.slice/crio-fef86bed45256223c89b3535cd2b73ed09a622658ea0ad89b652deb1cd558efe WatchSource:0}: Error finding container fef86bed45256223c89b3535cd2b73ed09a622658ea0ad89b652deb1cd558efe: Status 404 returned error can't find the container with id fef86bed45256223c89b3535cd2b73ed09a622658ea0ad89b652deb1cd558efe
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.333077 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-548968b897-62pzt"]
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.335233 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.341018 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.341416 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.341595 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.341768 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.342972 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.343467 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.352845 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-548968b897-62pzt"]
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.354425 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.422377 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbe0a176-211b-4530-af4e-17b3f9a97552-serving-cert\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.422446 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-client-ca\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.422506 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx2kg\" (UniqueName: \"kubernetes.io/projected/dbe0a176-211b-4530-af4e-17b3f9a97552-kube-api-access-dx2kg\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.422553 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-proxy-ca-bundles\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.422636 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-config\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.524397 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-client-ca\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.524803 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx2kg\" (UniqueName: \"kubernetes.io/projected/dbe0a176-211b-4530-af4e-17b3f9a97552-kube-api-access-dx2kg\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.524835 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-proxy-ca-bundles\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.524889 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-config\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.524947 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbe0a176-211b-4530-af4e-17b3f9a97552-serving-cert\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.527669 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-proxy-ca-bundles\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.527800 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-config\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.528562 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-client-ca\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.532407 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbe0a176-211b-4530-af4e-17b3f9a97552-serving-cert\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.557127 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx2kg\" (UniqueName: \"kubernetes.io/projected/dbe0a176-211b-4530-af4e-17b3f9a97552-kube-api-access-dx2kg\") pod \"controller-manager-548968b897-62pzt\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.574411 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" event={"ID":"f36ebdc1-69ee-40c7-a775-fe8382995712","Type":"ContainerStarted","Data":"0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c"}
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.574492 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" event={"ID":"f36ebdc1-69ee-40c7-a775-fe8382995712","Type":"ContainerStarted","Data":"fef86bed45256223c89b3535cd2b73ed09a622658ea0ad89b652deb1cd558efe"}
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.576403 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.587219 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-849dbf65f-s795x"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.588841 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.604644 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" podStartSLOduration=2.604618108 podStartE2EDuration="2.604618108s" podCreationTimestamp="2026-01-29 14:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:06:45.59857688 +0000 UTC m=+240.293311272" watchObservedRunningTime="2026-01-29 14:06:45.604618108 +0000 UTC m=+240.299352530"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.671418 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:45 crc kubenswrapper[4753]: I0129 14:06:45.823099 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.070289 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-548968b897-62pzt"]
Jan 29 14:06:46 crc kubenswrapper[4753]: W0129 14:06:46.078900 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbe0a176_211b_4530_af4e_17b3f9a97552.slice/crio-057bd0ebb10269a346cfd7a8248d7d4a3a002948d65df346f86da1b219188af1 WatchSource:0}: Error finding container 057bd0ebb10269a346cfd7a8248d7d4a3a002948d65df346f86da1b219188af1: Status 404 returned error can't find the container with id 057bd0ebb10269a346cfd7a8248d7d4a3a002948d65df346f86da1b219188af1
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.095081 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.167704 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129fa114-a24c-4ee9-8114-b8990390ecc2" path="/var/lib/kubelet/pods/129fa114-a24c-4ee9-8114-b8990390ecc2/volumes"
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.169021 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5bba10-4d0a-4096-b80f-797b0e0b77fc" path="/var/lib/kubelet/pods/df5bba10-4d0a-4096-b80f-797b0e0b77fc/volumes"
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.169801 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.170452 4753 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.188415 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.188519 4753 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3ef1e01c-ab3b-448f-a5e2-12af53ddc64b"
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.204423 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.204511 4753 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3ef1e01c-ab3b-448f-a5e2-12af53ddc64b"
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.584326 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548968b897-62pzt" event={"ID":"dbe0a176-211b-4530-af4e-17b3f9a97552","Type":"ContainerStarted","Data":"9d607c732f58c7f071355c8094fce7154f999c6496f7baebdeead59ae4dc4e98"}
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.584412 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548968b897-62pzt" event={"ID":"dbe0a176-211b-4530-af4e-17b3f9a97552","Type":"ContainerStarted","Data":"057bd0ebb10269a346cfd7a8248d7d4a3a002948d65df346f86da1b219188af1"}
Jan 29 14:06:46 crc kubenswrapper[4753]: I0129 14:06:46.612825 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-548968b897-62pzt" podStartSLOduration=3.612801101 podStartE2EDuration="3.612801101s" podCreationTimestamp="2026-01-29 14:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:06:46.609823748 +0000 UTC m=+241.304558130" watchObservedRunningTime="2026-01-29 14:06:46.612801101 +0000 UTC m=+241.307535483"
Jan 29 14:06:47 crc kubenswrapper[4753]: I0129 14:06:47.593497 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:47 crc kubenswrapper[4753]: I0129 14:06:47.598518 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-548968b897-62pzt"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.585633 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7v9s"]
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.586984 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p7v9s" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" containerName="registry-server" containerID="cri-o://f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9" gracePeriod=30
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.597881 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gz9wv"]
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.598521 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gz9wv" podUID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerName="registry-server" containerID="cri-o://50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f" gracePeriod=30
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.617515 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkq94"]
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.632178 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmmtj"]
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.632435 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jmmtj" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerName="registry-server" containerID="cri-o://c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b" gracePeriod=30
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.641293 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2sj8l"]
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.642522 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" podUID="19f60d75-bda2-4816-b146-f5e29203ffbc" containerName="marketplace-operator" containerID="cri-o://e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385" gracePeriod=30
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.644302 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2sj8l" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" containerName="registry-server" containerID="cri-o://8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2" gracePeriod=30
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.650488 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xnjnx"]
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.651109 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.662443 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xnjnx"]
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.674109 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/62f2be0b-c83f-4b74-80cc-504f1221b322-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xnjnx\" (UID: \"62f2be0b-c83f-4b74-80cc-504f1221b322\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.674715 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27hqs\" (UniqueName: \"kubernetes.io/projected/62f2be0b-c83f-4b74-80cc-504f1221b322-kube-api-access-27hqs\") pod \"marketplace-operator-79b997595-xnjnx\" (UID: \"62f2be0b-c83f-4b74-80cc-504f1221b322\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.674758 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62f2be0b-c83f-4b74-80cc-504f1221b322-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xnjnx\" (UID: \"62f2be0b-c83f-4b74-80cc-504f1221b322\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.776056 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/62f2be0b-c83f-4b74-80cc-504f1221b322-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xnjnx\" (UID: \"62f2be0b-c83f-4b74-80cc-504f1221b322\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.776145 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27hqs\" (UniqueName: \"kubernetes.io/projected/62f2be0b-c83f-4b74-80cc-504f1221b322-kube-api-access-27hqs\") pod \"marketplace-operator-79b997595-xnjnx\" (UID: \"62f2be0b-c83f-4b74-80cc-504f1221b322\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.776215 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62f2be0b-c83f-4b74-80cc-504f1221b322-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xnjnx\" (UID: \"62f2be0b-c83f-4b74-80cc-504f1221b322\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.781444 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62f2be0b-c83f-4b74-80cc-504f1221b322-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xnjnx\" (UID: \"62f2be0b-c83f-4b74-80cc-504f1221b322\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.788479 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/62f2be0b-c83f-4b74-80cc-504f1221b322-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xnjnx\" (UID: \"62f2be0b-c83f-4b74-80cc-504f1221b322\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:54 crc kubenswrapper[4753]: I0129 14:06:54.794946 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27hqs\" (UniqueName: \"kubernetes.io/projected/62f2be0b-c83f-4b74-80cc-504f1221b322-kube-api-access-27hqs\") pod \"marketplace-operator-79b997595-xnjnx\" (UID: \"62f2be0b-c83f-4b74-80cc-504f1221b322\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.073118 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.188104 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7v9s"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.284535 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-catalog-content\") pod \"dc59885a-9b00-47db-b9d8-e857c589abce\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.284702 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-utilities\") pod \"dc59885a-9b00-47db-b9d8-e857c589abce\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.284767 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctxk6\" (UniqueName: \"kubernetes.io/projected/dc59885a-9b00-47db-b9d8-e857c589abce-kube-api-access-ctxk6\") pod \"dc59885a-9b00-47db-b9d8-e857c589abce\" (UID: \"dc59885a-9b00-47db-b9d8-e857c589abce\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.288100 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-utilities" (OuterVolumeSpecName: "utilities") pod "dc59885a-9b00-47db-b9d8-e857c589abce" (UID: "dc59885a-9b00-47db-b9d8-e857c589abce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.291407 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc59885a-9b00-47db-b9d8-e857c589abce-kube-api-access-ctxk6" (OuterVolumeSpecName: "kube-api-access-ctxk6") pod "dc59885a-9b00-47db-b9d8-e857c589abce" (UID: "dc59885a-9b00-47db-b9d8-e857c589abce"). InnerVolumeSpecName "kube-api-access-ctxk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.349081 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc59885a-9b00-47db-b9d8-e857c589abce" (UID: "dc59885a-9b00-47db-b9d8-e857c589abce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.387262 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.388067 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc59885a-9b00-47db-b9d8-e857c589abce-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.388100 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctxk6\" (UniqueName: \"kubernetes.io/projected/dc59885a-9b00-47db-b9d8-e857c589abce-kube-api-access-ctxk6\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.392939 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gz9wv"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.400902 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jmmtj"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.402318 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.407042 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2sj8l"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489231 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58qbh\" (UniqueName: \"kubernetes.io/projected/48968e19-5dbd-4231-895f-c28e4178bf33-kube-api-access-58qbh\") pod \"48968e19-5dbd-4231-895f-c28e4178bf33\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489295 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-trusted-ca\") pod \"19f60d75-bda2-4816-b146-f5e29203ffbc\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489355 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2zwr\" (UniqueName: \"kubernetes.io/projected/e5ef4648-0d9e-487b-a796-476432ec0ca8-kube-api-access-f2zwr\") pod \"e5ef4648-0d9e-487b-a796-476432ec0ca8\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489391 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-catalog-content\") pod \"48968e19-5dbd-4231-895f-c28e4178bf33\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489573 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-utilities\") pod \"48968e19-5dbd-4231-895f-c28e4178bf33\" (UID: \"48968e19-5dbd-4231-895f-c28e4178bf33\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489639 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-utilities\") pod \"e5ef4648-0d9e-487b-a796-476432ec0ca8\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489692 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-catalog-content\") pod \"5875e879-79f3-4499-b460-b4b4a3e1637c\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489719 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-operator-metrics\") pod \"19f60d75-bda2-4816-b146-f5e29203ffbc\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489762 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f82mk\" (UniqueName: \"kubernetes.io/projected/19f60d75-bda2-4816-b146-f5e29203ffbc-kube-api-access-f82mk\") pod \"19f60d75-bda2-4816-b146-f5e29203ffbc\" (UID: \"19f60d75-bda2-4816-b146-f5e29203ffbc\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489786 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-utilities\") pod \"5875e879-79f3-4499-b460-b4b4a3e1637c\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489816 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw7pb\" (UniqueName: \"kubernetes.io/projected/5875e879-79f3-4499-b460-b4b4a3e1637c-kube-api-access-sw7pb\") pod \"5875e879-79f3-4499-b460-b4b4a3e1637c\" (UID: \"5875e879-79f3-4499-b460-b4b4a3e1637c\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.489842 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-catalog-content\") pod \"e5ef4648-0d9e-487b-a796-476432ec0ca8\" (UID: \"e5ef4648-0d9e-487b-a796-476432ec0ca8\") "
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.493481 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-utilities" (OuterVolumeSpecName: "utilities") pod "e5ef4648-0d9e-487b-a796-476432ec0ca8" (UID: "e5ef4648-0d9e-487b-a796-476432ec0ca8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.493686 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "19f60d75-bda2-4816-b146-f5e29203ffbc" (UID: "19f60d75-bda2-4816-b146-f5e29203ffbc"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.494241 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-utilities" (OuterVolumeSpecName: "utilities") pod "48968e19-5dbd-4231-895f-c28e4178bf33" (UID: "48968e19-5dbd-4231-895f-c28e4178bf33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.496178 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-utilities" (OuterVolumeSpecName: "utilities") pod "5875e879-79f3-4499-b460-b4b4a3e1637c" (UID: "5875e879-79f3-4499-b460-b4b4a3e1637c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.505743 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48968e19-5dbd-4231-895f-c28e4178bf33-kube-api-access-58qbh" (OuterVolumeSpecName: "kube-api-access-58qbh") pod "48968e19-5dbd-4231-895f-c28e4178bf33" (UID: "48968e19-5dbd-4231-895f-c28e4178bf33"). InnerVolumeSpecName "kube-api-access-58qbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.505858 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.505899 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.505913 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.505924 4753 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.506092 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5875e879-79f3-4499-b460-b4b4a3e1637c-kube-api-access-sw7pb" (OuterVolumeSpecName: "kube-api-access-sw7pb") pod "5875e879-79f3-4499-b460-b4b4a3e1637c" (UID: "5875e879-79f3-4499-b460-b4b4a3e1637c"). InnerVolumeSpecName "kube-api-access-sw7pb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.505854 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ef4648-0d9e-487b-a796-476432ec0ca8-kube-api-access-f2zwr" (OuterVolumeSpecName: "kube-api-access-f2zwr") pod "e5ef4648-0d9e-487b-a796-476432ec0ca8" (UID: "e5ef4648-0d9e-487b-a796-476432ec0ca8"). InnerVolumeSpecName "kube-api-access-f2zwr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.518846 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "19f60d75-bda2-4816-b146-f5e29203ffbc" (UID: "19f60d75-bda2-4816-b146-f5e29203ffbc"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.518961 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f60d75-bda2-4816-b146-f5e29203ffbc-kube-api-access-f82mk" (OuterVolumeSpecName: "kube-api-access-f82mk") pod "19f60d75-bda2-4816-b146-f5e29203ffbc" (UID: "19f60d75-bda2-4816-b146-f5e29203ffbc"). InnerVolumeSpecName "kube-api-access-f82mk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.544921 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5875e879-79f3-4499-b460-b4b4a3e1637c" (UID: "5875e879-79f3-4499-b460-b4b4a3e1637c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.571165 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5ef4648-0d9e-487b-a796-476432ec0ca8" (UID: "e5ef4648-0d9e-487b-a796-476432ec0ca8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.606994 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2zwr\" (UniqueName: \"kubernetes.io/projected/e5ef4648-0d9e-487b-a796-476432ec0ca8-kube-api-access-f2zwr\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.607038 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5875e879-79f3-4499-b460-b4b4a3e1637c-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.607048 4753 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/19f60d75-bda2-4816-b146-f5e29203ffbc-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.607057 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f82mk\" (UniqueName: \"kubernetes.io/projected/19f60d75-bda2-4816-b146-f5e29203ffbc-kube-api-access-f82mk\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.607066 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw7pb\" (UniqueName: \"kubernetes.io/projected/5875e879-79f3-4499-b460-b4b4a3e1637c-kube-api-access-sw7pb\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.607074 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ef4648-0d9e-487b-a796-476432ec0ca8-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.607083 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58qbh\" (UniqueName: \"kubernetes.io/projected/48968e19-5dbd-4231-895f-c28e4178bf33-kube-api-access-58qbh\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.626923 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48968e19-5dbd-4231-895f-c28e4178bf33" (UID: "48968e19-5dbd-4231-895f-c28e4178bf33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.648827 4753 generic.go:334] "Generic (PLEG): container finished" podID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerID="50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f" exitCode=0
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.648889 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz9wv" event={"ID":"e5ef4648-0d9e-487b-a796-476432ec0ca8","Type":"ContainerDied","Data":"50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.648914 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gz9wv" event={"ID":"e5ef4648-0d9e-487b-a796-476432ec0ca8","Type":"ContainerDied","Data":"84f12de06eec476466b6a8134fb427aa0c66259e4d26ae1eaca7c66107cf7818"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.648931 4753 scope.go:117] "RemoveContainer" containerID="50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.649075 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gz9wv"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.653989 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jmmtj"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.654234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmmtj" event={"ID":"5875e879-79f3-4499-b460-b4b4a3e1637c","Type":"ContainerDied","Data":"c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.653769 4753 generic.go:334] "Generic (PLEG): container finished" podID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerID="c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b" exitCode=0
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.656432 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmmtj" event={"ID":"5875e879-79f3-4499-b460-b4b4a3e1637c","Type":"ContainerDied","Data":"2f776b87193d1f11d63f3e3d43b41495f4ea10e54efbe7e0669a24a6262c2ac2"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.659598 4753 generic.go:334] "Generic (PLEG): container finished" podID="48968e19-5dbd-4231-895f-c28e4178bf33" containerID="8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2" exitCode=0
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.659657 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sj8l" event={"ID":"48968e19-5dbd-4231-895f-c28e4178bf33","Type":"ContainerDied","Data":"8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.659679 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sj8l" event={"ID":"48968e19-5dbd-4231-895f-c28e4178bf33","Type":"ContainerDied","Data":"48fac7314c91c0f85ccbb2ec22413e2ef7bfc191fc14520534d19d9313983eaa"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.659773 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2sj8l"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.668363 4753 generic.go:334] "Generic (PLEG): container finished" podID="dc59885a-9b00-47db-b9d8-e857c589abce" containerID="f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9" exitCode=0
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.668444 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7v9s" event={"ID":"dc59885a-9b00-47db-b9d8-e857c589abce","Type":"ContainerDied","Data":"f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.668452 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7v9s"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.668472 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7v9s" event={"ID":"dc59885a-9b00-47db-b9d8-e857c589abce","Type":"ContainerDied","Data":"e517ac2d5af94ab915954b5c6acdaa747a469aacd4e800a628a255502dfa777c"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.669740 4753 generic.go:334] "Generic (PLEG): container finished" podID="19f60d75-bda2-4816-b146-f5e29203ffbc" containerID="e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385" exitCode=0
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.669781 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" event={"ID":"19f60d75-bda2-4816-b146-f5e29203ffbc","Type":"ContainerDied","Data":"e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.669789 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.669808 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkq94" event={"ID":"19f60d75-bda2-4816-b146-f5e29203ffbc","Type":"ContainerDied","Data":"ce986b1a47f994b5d1b8ce1640b0b95878642b06ebf70c4321f2dd4c2b0d61cc"}
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.685808 4753 scope.go:117] "RemoveContainer" containerID="9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.693242 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gz9wv"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.697059 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gz9wv"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.705033 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmmtj"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.708449 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48968e19-5dbd-4231-895f-c28e4178bf33-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.712511 4753 scope.go:117] "RemoveContainer" containerID="6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.720349 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xnjnx"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.723901 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmmtj"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.728349 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkq94"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.732410 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkq94"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.738545 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7v9s"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.744599 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p7v9s"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.760331 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2sj8l"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.763857 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2sj8l"]
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.777252 4753 scope.go:117] "RemoveContainer" containerID="50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.777687 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f\": container with ID starting with 50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f not found: ID does not exist" containerID="50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.777896 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f"} err="failed to get container status \"50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f\": rpc error: code = NotFound desc = could not find container \"50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f\": container with ID starting with 50b3d36a1e022ef681910a42bbdd3591a863dc1d8f0fc99611b8d2982e12994f not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.778193 4753 scope.go:117] "RemoveContainer" containerID="9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.778752 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37\": container with ID starting with 9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37 not found: ID does not exist" containerID="9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.778788 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37"} err="failed to get container status \"9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37\": rpc error: code = NotFound desc = could not find container \"9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37\": container with ID starting with 9154529fbfc6cb8ebee3c2a6585536d550bc2dd3ec99f25d81bf8e7993295f37 not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.778813 4753 scope.go:117] "RemoveContainer" containerID="6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.779014 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc\": container with ID starting with 6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc not found: ID does not exist" containerID="6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.779041 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc"} err="failed to get container status \"6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc\": rpc error: code = NotFound desc = could not find container \"6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc\": container with ID starting with 6e8f331d5acb21ae6a7ba95bc0aafd8b3dcfacb8dcfc6553a75f3f6deed29adc not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.779052 4753 scope.go:117] "RemoveContainer" containerID="c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.812727 4753 scope.go:117] "RemoveContainer" containerID="31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.828378 4753 scope.go:117] "RemoveContainer" containerID="3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.845526 4753 scope.go:117] "RemoveContainer" containerID="c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.846554 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b\": container with ID starting with c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b not found: ID does not exist" containerID="c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.846583 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b"} err="failed to get container status \"c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b\": rpc error: code = NotFound desc = could not find container \"c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b\": container with ID starting with c48ce2873e7a7b82563ac92f91b8646559e786cb9117beeb5b7388ee65361b4b not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.846605 4753 scope.go:117] "RemoveContainer" containerID="31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.847351 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d\": container with ID starting with 31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d not found: ID does not exist" containerID="31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.847404 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d"} err="failed to get container status \"31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d\": rpc error: code = NotFound desc = could not find container \"31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d\": container with ID starting with 31a07786b6eb5bc481dfe78d651f199e06bafed6cf8b5ccf0d3de5cb8907b34d not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.847441 4753 scope.go:117] "RemoveContainer" containerID="3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.847773 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662\": container with ID starting with 3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662 not found: ID does not exist" containerID="3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.847801 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662"} err="failed to get container status \"3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662\": rpc error: code = NotFound desc = could not find container \"3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662\": container with ID starting with 3e7047dad5e16a0a1fadbaac25e95dc02afbd4ff4b3c7285f35330f5fb57e662 not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.847816 4753 scope.go:117] "RemoveContainer" containerID="8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.863716 4753 scope.go:117] "RemoveContainer" containerID="1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.880573 4753 scope.go:117] "RemoveContainer" containerID="c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.897145 4753 scope.go:117] "RemoveContainer" containerID="8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.897953 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2\": container with ID starting with 8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2 not found: ID does not exist" containerID="8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.897986 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2"} err="failed to get container status \"8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2\": rpc error: code = NotFound desc = could not find container \"8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2\": container with ID starting with 8d7f5875684f1e0bd285c3e34af342cd1f3b91dc3729759ee20e5d80dd8b57b2 not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.898008 4753 scope.go:117] "RemoveContainer" containerID="1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.898593 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549\": container with ID starting with 1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549 not found: ID does not exist" containerID="1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.898634 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549"} err="failed to get container status \"1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549\": rpc error: code = NotFound desc = could not find container \"1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549\": container with ID starting with 1060bbd64684ed5c03eaa78eeb709292877bd744de1314fee9630f048ab1b549 not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.898664 4753 scope.go:117] "RemoveContainer" containerID="c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.899061 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e\": container with ID starting with c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e not found: ID does not exist" containerID="c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.899310 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e"} err="failed to get container status \"c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e\": rpc error: code = NotFound desc = could not find container \"c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e\": container with ID starting with c8ca6faebf2a7c46ad4412d8701441ce5670dec51344edf7ff524f17fe8ee19e not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.899503 4753 scope.go:117] "RemoveContainer" containerID="f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.912016 4753 scope.go:117] "RemoveContainer" containerID="80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.927328 4753 scope.go:117] "RemoveContainer" containerID="f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.946854 4753 scope.go:117] "RemoveContainer" containerID="f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.947298 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9\": container with ID starting with f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9 not found: ID does not exist" containerID="f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.947506 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9"} err="failed to get container status \"f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9\": rpc error: code = NotFound desc = could not find container \"f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9\": container with ID starting with f14d90a27c7e66717d70ecfecac91b97116b3574721b0da5e5acf948c87cbbc9 not found: ID does not exist"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.947682 4753 scope.go:117] "RemoveContainer" containerID="80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2"
Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.948235 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2\": container with ID starting with 80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2 not found: ID does not exist" containerID="80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2"
Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.948272 4753 pod_container_deletor.go:53]
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2"} err="failed to get container status \"80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2\": rpc error: code = NotFound desc = could not find container \"80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2\": container with ID starting with 80ae9785c2385c9f1c68272a56568de0d3efee2f12bfaa718d454fbd7a96e2c2 not found: ID does not exist" Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.948301 4753 scope.go:117] "RemoveContainer" containerID="f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963" Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.948679 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963\": container with ID starting with f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963 not found: ID does not exist" containerID="f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963" Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.948850 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963"} err="failed to get container status \"f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963\": rpc error: code = NotFound desc = could not find container \"f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963\": container with ID starting with f2510cbe030b1ab8d02b9372fb9cfb68d1d405b10e3f0148a4352492f7f28963 not found: ID does not exist" Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.949008 4753 scope.go:117] "RemoveContainer" containerID="e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385" Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.978833 4753 scope.go:117] "RemoveContainer" containerID="e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385" Jan 29 14:06:55 crc kubenswrapper[4753]: E0129 14:06:55.982564 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385\": container with ID starting with e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385 not found: ID does not exist" containerID="e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385" Jan 29 14:06:55 crc kubenswrapper[4753]: I0129 14:06:55.982613 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385"} err="failed to get container status \"e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385\": rpc error: code = NotFound desc = could not find container \"e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385\": container with ID starting with e66aeaf66039076401d16defacd429061e38a8506d57b5210ecda3973f79c385 not found: ID does not exist" Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.156687 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f60d75-bda2-4816-b146-f5e29203ffbc" path="/var/lib/kubelet/pods/19f60d75-bda2-4816-b146-f5e29203ffbc/volumes" Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.157237 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="48968e19-5dbd-4231-895f-c28e4178bf33" path="/var/lib/kubelet/pods/48968e19-5dbd-4231-895f-c28e4178bf33/volumes" Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.157913 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" path="/var/lib/kubelet/pods/5875e879-79f3-4499-b460-b4b4a3e1637c/volumes" Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.158984 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" path="/var/lib/kubelet/pods/dc59885a-9b00-47db-b9d8-e857c589abce/volumes" Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.159571 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5ef4648-0d9e-487b-a796-476432ec0ca8" path="/var/lib/kubelet/pods/e5ef4648-0d9e-487b-a796-476432ec0ca8/volumes" Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.681400 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx" event={"ID":"62f2be0b-c83f-4b74-80cc-504f1221b322","Type":"ContainerStarted","Data":"4e4e2b8329a85bdc0f8df98e36e0d94a63738b651d79113715ca43061a5bf068"} Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.681458 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx" event={"ID":"62f2be0b-c83f-4b74-80cc-504f1221b322","Type":"ContainerStarted","Data":"dbd6c9f2f6fea79ada6d293d774eeccee1240a26a5ec76057cb1936f054c5fb8"} Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.684289 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx" Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.688844 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx" Jan 29 14:06:56 crc kubenswrapper[4753]: I0129 14:06:56.712634 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xnjnx" podStartSLOduration=2.712600963 podStartE2EDuration="2.712600963s" podCreationTimestamp="2026-01-29 14:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:06:56.704091017 +0000 UTC m=+251.398825429" watchObservedRunningTime="2026-01-29 14:06:56.712600963 +0000 UTC m=+251.407335365" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.091438 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-548968b897-62pzt"] Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.092207 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-548968b897-62pzt" podUID="dbe0a176-211b-4530-af4e-17b3f9a97552" containerName="controller-manager" containerID="cri-o://9d607c732f58c7f071355c8094fce7154f999c6496f7baebdeead59ae4dc4e98" gracePeriod=30 Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.115993 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"] Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.116494 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" 
podUID="f36ebdc1-69ee-40c7-a775-fe8382995712" containerName="route-controller-manager" containerID="cri-o://0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c" gracePeriod=30 Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.715904 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.741429 4753 generic.go:334] "Generic (PLEG): container finished" podID="dbe0a176-211b-4530-af4e-17b3f9a97552" containerID="9d607c732f58c7f071355c8094fce7154f999c6496f7baebdeead59ae4dc4e98" exitCode=0 Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.741566 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548968b897-62pzt" event={"ID":"dbe0a176-211b-4530-af4e-17b3f9a97552","Type":"ContainerDied","Data":"9d607c732f58c7f071355c8094fce7154f999c6496f7baebdeead59ae4dc4e98"} Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.746940 4753 generic.go:334] "Generic (PLEG): container finished" podID="f36ebdc1-69ee-40c7-a775-fe8382995712" containerID="0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c" exitCode=0 Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.746995 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.747019 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" event={"ID":"f36ebdc1-69ee-40c7-a775-fe8382995712","Type":"ContainerDied","Data":"0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c"} Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.747079 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5" event={"ID":"f36ebdc1-69ee-40c7-a775-fe8382995712","Type":"ContainerDied","Data":"fef86bed45256223c89b3535cd2b73ed09a622658ea0ad89b652deb1cd558efe"} Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.747109 4753 scope.go:117] "RemoveContainer" containerID="0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.776032 4753 scope.go:117] "RemoveContainer" containerID="0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c" Jan 29 14:07:03 crc kubenswrapper[4753]: E0129 14:07:03.776563 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c\": container with ID starting with 0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c not found: ID does not exist" containerID="0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.776607 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c"} err="failed to get container status \"0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c\": rpc error: code = NotFound desc = could not find container \"0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c\": container with ID starting with 
0af4a2c2eabbaaa41847b6c7ea9e3b6c9f7b60360d4db0fa4de6c22852eee24c not found: ID does not exist" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.822233 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f36ebdc1-69ee-40c7-a775-fe8382995712-serving-cert\") pod \"f36ebdc1-69ee-40c7-a775-fe8382995712\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.822329 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7tgk\" (UniqueName: \"kubernetes.io/projected/f36ebdc1-69ee-40c7-a775-fe8382995712-kube-api-access-v7tgk\") pod \"f36ebdc1-69ee-40c7-a775-fe8382995712\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.822412 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-client-ca\") pod \"f36ebdc1-69ee-40c7-a775-fe8382995712\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.822437 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-config\") pod \"f36ebdc1-69ee-40c7-a775-fe8382995712\" (UID: \"f36ebdc1-69ee-40c7-a775-fe8382995712\") " Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.823660 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-client-ca" (OuterVolumeSpecName: "client-ca") pod "f36ebdc1-69ee-40c7-a775-fe8382995712" (UID: "f36ebdc1-69ee-40c7-a775-fe8382995712"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.823828 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-config" (OuterVolumeSpecName: "config") pod "f36ebdc1-69ee-40c7-a775-fe8382995712" (UID: "f36ebdc1-69ee-40c7-a775-fe8382995712"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.828680 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36ebdc1-69ee-40c7-a775-fe8382995712-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f36ebdc1-69ee-40c7-a775-fe8382995712" (UID: "f36ebdc1-69ee-40c7-a775-fe8382995712"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.829269 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36ebdc1-69ee-40c7-a775-fe8382995712-kube-api-access-v7tgk" (OuterVolumeSpecName: "kube-api-access-v7tgk") pod "f36ebdc1-69ee-40c7-a775-fe8382995712" (UID: "f36ebdc1-69ee-40c7-a775-fe8382995712"). InnerVolumeSpecName "kube-api-access-v7tgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.842764 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-548968b897-62pzt" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.924592 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-client-ca\") pod \"dbe0a176-211b-4530-af4e-17b3f9a97552\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.924650 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx2kg\" (UniqueName: \"kubernetes.io/projected/dbe0a176-211b-4530-af4e-17b3f9a97552-kube-api-access-dx2kg\") pod \"dbe0a176-211b-4530-af4e-17b3f9a97552\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.924680 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-config\") pod \"dbe0a176-211b-4530-af4e-17b3f9a97552\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.924711 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-proxy-ca-bundles\") pod \"dbe0a176-211b-4530-af4e-17b3f9a97552\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.924735 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbe0a176-211b-4530-af4e-17b3f9a97552-serving-cert\") pod \"dbe0a176-211b-4530-af4e-17b3f9a97552\" (UID: \"dbe0a176-211b-4530-af4e-17b3f9a97552\") " Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.924927 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f36ebdc1-69ee-40c7-a775-fe8382995712-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.924940 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7tgk\" (UniqueName: \"kubernetes.io/projected/f36ebdc1-69ee-40c7-a775-fe8382995712-kube-api-access-v7tgk\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.925066 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.925076 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f36ebdc1-69ee-40c7-a775-fe8382995712-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.925492 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-client-ca" (OuterVolumeSpecName: "client-ca") pod "dbe0a176-211b-4530-af4e-17b3f9a97552" (UID: "dbe0a176-211b-4530-af4e-17b3f9a97552"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.925505 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dbe0a176-211b-4530-af4e-17b3f9a97552" (UID: "dbe0a176-211b-4530-af4e-17b3f9a97552"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.925843 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-config" (OuterVolumeSpecName: "config") pod "dbe0a176-211b-4530-af4e-17b3f9a97552" (UID: "dbe0a176-211b-4530-af4e-17b3f9a97552"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.927984 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbe0a176-211b-4530-af4e-17b3f9a97552-kube-api-access-dx2kg" (OuterVolumeSpecName: "kube-api-access-dx2kg") pod "dbe0a176-211b-4530-af4e-17b3f9a97552" (UID: "dbe0a176-211b-4530-af4e-17b3f9a97552"). InnerVolumeSpecName "kube-api-access-dx2kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:07:03 crc kubenswrapper[4753]: I0129 14:07:03.929661 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbe0a176-211b-4530-af4e-17b3f9a97552-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dbe0a176-211b-4530-af4e-17b3f9a97552" (UID: "dbe0a176-211b-4530-af4e-17b3f9a97552"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.026647 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx2kg\" (UniqueName: \"kubernetes.io/projected/dbe0a176-211b-4530-af4e-17b3f9a97552-kube-api-access-dx2kg\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.026705 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.026721 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.026733 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbe0a176-211b-4530-af4e-17b3f9a97552-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.026746 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbe0a176-211b-4530-af4e-17b3f9a97552-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.092411 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"] Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.098470 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686c96dcf-f6md5"] Jan 29 
14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.159829 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f36ebdc1-69ee-40c7-a775-fe8382995712" path="/var/lib/kubelet/pods/f36ebdc1-69ee-40c7-a775-fe8382995712/volumes" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353419 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd"] Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353682 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerName="extract-utilities" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353696 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerName="extract-utilities" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353706 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353711 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353721 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerName="extract-utilities" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353727 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerName="extract-utilities" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353733 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerName="extract-content" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353741 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerName="extract-content" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353749 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" containerName="extract-content" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353756 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" containerName="extract-content" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353764 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe0a176-211b-4530-af4e-17b3f9a97552" containerName="controller-manager" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353770 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe0a176-211b-4530-af4e-17b3f9a97552" containerName="controller-manager" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353776 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerName="extract-content" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353782 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerName="extract-content" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353790 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" containerName="extract-utilities" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353796 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="48968e19-5dbd-4231-895f-c28e4178bf33" containerName="extract-utilities" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353807 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f60d75-bda2-4816-b146-f5e29203ffbc" containerName="marketplace-operator" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353812 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f60d75-bda2-4816-b146-f5e29203ffbc" containerName="marketplace-operator" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353821 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" containerName="extract-utilities" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353827 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" containerName="extract-utilities" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353835 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353842 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353850 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353856 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353863 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353868 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353878 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36ebdc1-69ee-40c7-a775-fe8382995712" containerName="route-controller-manager" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353884 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36ebdc1-69ee-40c7-a775-fe8382995712" containerName="route-controller-manager" Jan 29 14:07:04 crc kubenswrapper[4753]: E0129 14:07:04.353892 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" containerName="extract-content" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353898 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" containerName="extract-content" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353977 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="48968e19-5dbd-4231-895f-c28e4178bf33" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353987 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f60d75-bda2-4816-b146-f5e29203ffbc" containerName="marketplace-operator" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.353996 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ef4648-0d9e-487b-a796-476432ec0ca8" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 
14:07:04.354005 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc59885a-9b00-47db-b9d8-e857c589abce" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.354013 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f36ebdc1-69ee-40c7-a775-fe8382995712" containerName="route-controller-manager" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.354022 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbe0a176-211b-4530-af4e-17b3f9a97552" containerName="controller-manager" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.354031 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5875e879-79f3-4499-b460-b4b4a3e1637c" containerName="registry-server" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.354414 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.370278 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc"] Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.371340 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.376360 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.376393 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.376951 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.377223 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.377722 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.378740 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc"] Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.378969 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.383228 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd"] Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.430579 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-config\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.430622 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-proxy-ca-bundles\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.430645 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrxcz\" (UniqueName: \"kubernetes.io/projected/53a58427-36ab-4310-bcb4-9d70376435d9-kube-api-access-xrxcz\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.430670 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df8272e7-785d-4800-aa1e-7c40f655bb23-serving-cert\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.430705 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a58427-36ab-4310-bcb4-9d70376435d9-serving-cert\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.430733 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-config\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.430754 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-client-ca\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.430788 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8qb\" (UniqueName: \"kubernetes.io/projected/df8272e7-785d-4800-aa1e-7c40f655bb23-kube-api-access-zk8qb\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.430818 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-client-ca\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.532527 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-proxy-ca-bundles\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.532877 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrxcz\" (UniqueName: \"kubernetes.io/projected/53a58427-36ab-4310-bcb4-9d70376435d9-kube-api-access-xrxcz\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.532939 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df8272e7-785d-4800-aa1e-7c40f655bb23-serving-cert\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.532973 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a58427-36ab-4310-bcb4-9d70376435d9-serving-cert\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.533087 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-config\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.533109 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-client-ca\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.533205 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk8qb\" (UniqueName: \"kubernetes.io/projected/df8272e7-785d-4800-aa1e-7c40f655bb23-kube-api-access-zk8qb\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.533350 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-client-ca\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.533421 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-config\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: 
\"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.535665 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-config\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.537908 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-client-ca\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.542658 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-client-ca\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.542874 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a58427-36ab-4310-bcb4-9d70376435d9-serving-cert\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.548067 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-proxy-ca-bundles\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.551626 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk8qb\" (UniqueName: \"kubernetes.io/projected/df8272e7-785d-4800-aa1e-7c40f655bb23-kube-api-access-zk8qb\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.553097 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-config\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.558249 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df8272e7-785d-4800-aa1e-7c40f655bb23-serving-cert\") pod \"route-controller-manager-866f46fcdc-bkzmc\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.566957 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xrxcz\" (UniqueName: \"kubernetes.io/projected/53a58427-36ab-4310-bcb4-9d70376435d9-kube-api-access-xrxcz\") pod \"controller-manager-9b9b64d5f-s8ttd\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.672804 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.761247 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548968b897-62pzt" event={"ID":"dbe0a176-211b-4530-af4e-17b3f9a97552","Type":"ContainerDied","Data":"057bd0ebb10269a346cfd7a8248d7d4a3a002948d65df346f86da1b219188af1"} Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.761565 4753 scope.go:117] "RemoveContainer" containerID="9d607c732f58c7f071355c8094fce7154f999c6496f7baebdeead59ae4dc4e98" Jan 29 14:07:04 crc kubenswrapper[4753]: I0129 14:07:04.761342 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-548968b897-62pzt" Jan 29 14:07:05 crc kubenswrapper[4753]: I0129 14:07:05.107606 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd"] Jan 29 14:07:05 crc kubenswrapper[4753]: W0129 14:07:05.117053 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53a58427_36ab_4310_bcb4_9d70376435d9.slice/crio-b8c298dae090d852e12b2becffba8abf5429e3ab0fbc2e5842ff5c1fce5bfe89 WatchSource:0}: Error finding container b8c298dae090d852e12b2becffba8abf5429e3ab0fbc2e5842ff5c1fce5bfe89: Status 404 returned error can't find the container with id b8c298dae090d852e12b2becffba8abf5429e3ab0fbc2e5842ff5c1fce5bfe89 Jan 29 14:07:05 crc kubenswrapper[4753]: I0129 14:07:05.622532 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:05 crc kubenswrapper[4753]: I0129 14:07:05.636126 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-548968b897-62pzt"] Jan 29 14:07:05 crc kubenswrapper[4753]: I0129 14:07:05.640077 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-548968b897-62pzt"] Jan 29 14:07:05 crc kubenswrapper[4753]: I0129 14:07:05.771855 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" event={"ID":"53a58427-36ab-4310-bcb4-9d70376435d9","Type":"ContainerStarted","Data":"d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879"} Jan 29 14:07:05 crc kubenswrapper[4753]: I0129 14:07:05.772475 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" event={"ID":"53a58427-36ab-4310-bcb4-9d70376435d9","Type":"ContainerStarted","Data":"b8c298dae090d852e12b2becffba8abf5429e3ab0fbc2e5842ff5c1fce5bfe89"} Jan 29 14:07:05 crc kubenswrapper[4753]: I0129 14:07:05.772507 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:05 crc kubenswrapper[4753]: I0129 14:07:05.782294 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:07:05 crc kubenswrapper[4753]: I0129 14:07:05.800943 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" podStartSLOduration=2.800921717 podStartE2EDuration="2.800921717s" podCreationTimestamp="2026-01-29 14:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:07:05.795361485 +0000 UTC m=+260.490095867" watchObservedRunningTime="2026-01-29 14:07:05.800921717 +0000 UTC m=+260.495656109" Jan 29 14:07:06 crc kubenswrapper[4753]: I0129 14:07:06.033563 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc"] Jan 29 14:07:06 crc kubenswrapper[4753]: W0129 14:07:06.048337 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf8272e7_785d_4800_aa1e_7c40f655bb23.slice/crio-5eee4adcc41d7cd69a128fc876be1852db544dd0e082b594427cdc0f816cc102 WatchSource:0}: Error finding container 5eee4adcc41d7cd69a128fc876be1852db544dd0e082b594427cdc0f816cc102: Status 404 returned error can't find the container with id 5eee4adcc41d7cd69a128fc876be1852db544dd0e082b594427cdc0f816cc102 Jan 29 14:07:06 crc kubenswrapper[4753]: I0129 14:07:06.162069 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbe0a176-211b-4530-af4e-17b3f9a97552" path="/var/lib/kubelet/pods/dbe0a176-211b-4530-af4e-17b3f9a97552/volumes" Jan 29 14:07:06 crc kubenswrapper[4753]: I0129 14:07:06.801572 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" event={"ID":"df8272e7-785d-4800-aa1e-7c40f655bb23","Type":"ContainerStarted","Data":"a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde"} Jan 29 14:07:06 crc kubenswrapper[4753]: 
I0129 14:07:06.802080 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" event={"ID":"df8272e7-785d-4800-aa1e-7c40f655bb23","Type":"ContainerStarted","Data":"5eee4adcc41d7cd69a128fc876be1852db544dd0e082b594427cdc0f816cc102"} Jan 29 14:07:06 crc kubenswrapper[4753]: I0129 14:07:06.824646 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" podStartSLOduration=3.824621632 podStartE2EDuration="3.824621632s" podCreationTimestamp="2026-01-29 14:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:07:06.821545799 +0000 UTC m=+261.516280191" watchObservedRunningTime="2026-01-29 14:07:06.824621632 +0000 UTC m=+261.519356014" Jan 29 14:07:07 crc kubenswrapper[4753]: I0129 14:07:07.807532 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:07 crc kubenswrapper[4753]: I0129 14:07:07.813131 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.124736 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc"] Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.125692 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" podUID="df8272e7-785d-4800-aa1e-7c40f655bb23" containerName="route-controller-manager" containerID="cri-o://a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde" gracePeriod=30 Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.615236 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.808433 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk8qb\" (UniqueName: \"kubernetes.io/projected/df8272e7-785d-4800-aa1e-7c40f655bb23-kube-api-access-zk8qb\") pod \"df8272e7-785d-4800-aa1e-7c40f655bb23\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.808544 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df8272e7-785d-4800-aa1e-7c40f655bb23-serving-cert\") pod \"df8272e7-785d-4800-aa1e-7c40f655bb23\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.808588 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-config\") pod \"df8272e7-785d-4800-aa1e-7c40f655bb23\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.808607 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-client-ca\") pod \"df8272e7-785d-4800-aa1e-7c40f655bb23\" (UID: \"df8272e7-785d-4800-aa1e-7c40f655bb23\") " Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.809554 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-client-ca" (OuterVolumeSpecName: "client-ca") pod "df8272e7-785d-4800-aa1e-7c40f655bb23" (UID: "df8272e7-785d-4800-aa1e-7c40f655bb23"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.810313 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-config" (OuterVolumeSpecName: "config") pod "df8272e7-785d-4800-aa1e-7c40f655bb23" (UID: "df8272e7-785d-4800-aa1e-7c40f655bb23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.814194 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df8272e7-785d-4800-aa1e-7c40f655bb23-kube-api-access-zk8qb" (OuterVolumeSpecName: "kube-api-access-zk8qb") pod "df8272e7-785d-4800-aa1e-7c40f655bb23" (UID: "df8272e7-785d-4800-aa1e-7c40f655bb23"). InnerVolumeSpecName "kube-api-access-zk8qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.815904 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8272e7-785d-4800-aa1e-7c40f655bb23-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "df8272e7-785d-4800-aa1e-7c40f655bb23" (UID: "df8272e7-785d-4800-aa1e-7c40f655bb23"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.909355 4753 generic.go:334] "Generic (PLEG): container finished" podID="df8272e7-785d-4800-aa1e-7c40f655bb23" containerID="a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde" exitCode=0 Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.909440 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" event={"ID":"df8272e7-785d-4800-aa1e-7c40f655bb23","Type":"ContainerDied","Data":"a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde"} Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.909492 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" event={"ID":"df8272e7-785d-4800-aa1e-7c40f655bb23","Type":"ContainerDied","Data":"5eee4adcc41d7cd69a128fc876be1852db544dd0e082b594427cdc0f816cc102"} Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.909505 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.909531 4753 scope.go:117] "RemoveContainer" containerID="a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.910090 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk8qb\" (UniqueName: \"kubernetes.io/projected/df8272e7-785d-4800-aa1e-7c40f655bb23-kube-api-access-zk8qb\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.910279 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df8272e7-785d-4800-aa1e-7c40f655bb23-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.910325 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.910350 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df8272e7-785d-4800-aa1e-7c40f655bb23-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.943451 4753 scope.go:117] "RemoveContainer" containerID="a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde" Jan 29 14:07:23 crc kubenswrapper[4753]: E0129 14:07:23.943914 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde\": container with ID starting with a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde not found: ID does not exist" containerID="a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.943988 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde"} err="failed to get container status \"a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde\": rpc error: code = NotFound desc = could not find container 
\"a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde\": container with ID starting with a51552d227dc2790e8697422a8632e7a29833b7739550968bb479ef6aae0adde not found: ID does not exist" Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.957351 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc"] Jan 29 14:07:23 crc kubenswrapper[4753]: I0129 14:07:23.963028 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-866f46fcdc-bkzmc"] Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.161750 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df8272e7-785d-4800-aa1e-7c40f655bb23" path="/var/lib/kubelet/pods/df8272e7-785d-4800-aa1e-7c40f655bb23/volumes" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.364404 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4"] Jan 29 14:07:24 crc kubenswrapper[4753]: E0129 14:07:24.364698 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8272e7-785d-4800-aa1e-7c40f655bb23" containerName="route-controller-manager" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.365253 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8272e7-785d-4800-aa1e-7c40f655bb23" containerName="route-controller-manager" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.365750 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="df8272e7-785d-4800-aa1e-7c40f655bb23" containerName="route-controller-manager" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.366516 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.378986 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4"] Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.401851 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.402440 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.402721 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.402845 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.403232 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.403658 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.416266 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-serving-cert\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.416328 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-client-ca\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.416375 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-config\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.416423 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlsl8\" (UniqueName: \"kubernetes.io/projected/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-kube-api-access-hlsl8\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.517237 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlsl8\" (UniqueName: \"kubernetes.io/projected/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-kube-api-access-hlsl8\") pod 
\"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.517332 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-serving-cert\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.517358 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-client-ca\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.517382 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-config\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.519000 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-config\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.519325 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-client-ca\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.526815 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-serving-cert\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.553292 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlsl8\" (UniqueName: \"kubernetes.io/projected/5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3-kube-api-access-hlsl8\") pod \"route-controller-manager-7686c96dcf-7pps4\" (UID: \"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3\") " pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:24 crc kubenswrapper[4753]: I0129 14:07:24.722712 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:25 crc kubenswrapper[4753]: I0129 14:07:25.166516 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4"] Jan 29 14:07:25 crc kubenswrapper[4753]: I0129 14:07:25.924049 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" event={"ID":"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3","Type":"ContainerStarted","Data":"02f7d12e2e8471ca5ee7bc248dd7f34b1f572dee32871b425d40d9cecf8c36c1"} Jan 29 14:07:25 crc kubenswrapper[4753]: I0129 14:07:25.924400 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" event={"ID":"5f8bf0c8-bcd5-4816-965b-bc7cabfdceb3","Type":"ContainerStarted","Data":"3a34d9f65aab48a43a1b28a7cf10ad5450a23395b7a5a6b17cbe7d9a4687d1e3"} Jan 29 14:07:25 crc kubenswrapper[4753]: I0129 14:07:25.925308 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:25 crc kubenswrapper[4753]: I0129 14:07:25.931935 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" Jan 29 14:07:25 crc kubenswrapper[4753]: I0129 14:07:25.943356 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7686c96dcf-7pps4" podStartSLOduration=2.943332754 podStartE2EDuration="2.943332754s" podCreationTimestamp="2026-01-29 14:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:07:25.941251278 +0000 UTC m=+280.635985660" watchObservedRunningTime="2026-01-29 14:07:25.943332754 +0000 UTC m=+280.638067146" Jan 29 14:07:45 crc kubenswrapper[4753]: I0129 14:07:45.936698 4753 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 29 14:07:50 crc kubenswrapper[4753]: I0129 14:07:50.832659 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l6qbh"] Jan 29 14:07:50 crc kubenswrapper[4753]: I0129 14:07:50.835637 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:50 crc kubenswrapper[4753]: I0129 14:07:50.839705 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 29 14:07:50 crc kubenswrapper[4753]: I0129 14:07:50.849020 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l6qbh"] Jan 29 14:07:50 crc kubenswrapper[4753]: I0129 14:07:50.981399 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a8e6cb9-8e67-42dd-9827-812a46628fb5-catalog-content\") pod \"redhat-operators-l6qbh\" (UID: \"3a8e6cb9-8e67-42dd-9827-812a46628fb5\") " pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:50 crc kubenswrapper[4753]: I0129 14:07:50.981458 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcjd6\" (UniqueName: \"kubernetes.io/projected/3a8e6cb9-8e67-42dd-9827-812a46628fb5-kube-api-access-dcjd6\") pod \"redhat-operators-l6qbh\" (UID: \"3a8e6cb9-8e67-42dd-9827-812a46628fb5\") " pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:50 crc kubenswrapper[4753]: I0129 14:07:50.981554 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a8e6cb9-8e67-42dd-9827-812a46628fb5-utilities\") pod \"redhat-operators-l6qbh\" (UID: \"3a8e6cb9-8e67-42dd-9827-812a46628fb5\") " pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.015956 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8spmk"] Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.017311 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.019396 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.033652 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8spmk"] Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.083255 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcjd6\" (UniqueName: \"kubernetes.io/projected/3a8e6cb9-8e67-42dd-9827-812a46628fb5-kube-api-access-dcjd6\") pod \"redhat-operators-l6qbh\" (UID: \"3a8e6cb9-8e67-42dd-9827-812a46628fb5\") " pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.083320 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a8e6cb9-8e67-42dd-9827-812a46628fb5-utilities\") pod \"redhat-operators-l6qbh\" (UID: \"3a8e6cb9-8e67-42dd-9827-812a46628fb5\") " pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.083405 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a8e6cb9-8e67-42dd-9827-812a46628fb5-catalog-content\") pod \"redhat-operators-l6qbh\" (UID: \"3a8e6cb9-8e67-42dd-9827-812a46628fb5\") " pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.083910 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a8e6cb9-8e67-42dd-9827-812a46628fb5-catalog-content\") pod \"redhat-operators-l6qbh\" (UID: \"3a8e6cb9-8e67-42dd-9827-812a46628fb5\") " pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.084202 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a8e6cb9-8e67-42dd-9827-812a46628fb5-utilities\") pod \"redhat-operators-l6qbh\" (UID: \"3a8e6cb9-8e67-42dd-9827-812a46628fb5\") " pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.107986 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcjd6\" (UniqueName: \"kubernetes.io/projected/3a8e6cb9-8e67-42dd-9827-812a46628fb5-kube-api-access-dcjd6\") pod \"redhat-operators-l6qbh\" (UID: \"3a8e6cb9-8e67-42dd-9827-812a46628fb5\") " pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.185001 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9371f46-b818-44df-9f4f-4c04ac5fd78d-utilities\") pod \"redhat-marketplace-8spmk\" (UID: \"b9371f46-b818-44df-9f4f-4c04ac5fd78d\") " pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.185077 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz8c7\" (UniqueName: \"kubernetes.io/projected/b9371f46-b818-44df-9f4f-4c04ac5fd78d-kube-api-access-nz8c7\") pod \"redhat-marketplace-8spmk\" (UID: \"b9371f46-b818-44df-9f4f-4c04ac5fd78d\") " 
pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.185133 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9371f46-b818-44df-9f4f-4c04ac5fd78d-catalog-content\") pod \"redhat-marketplace-8spmk\" (UID: \"b9371f46-b818-44df-9f4f-4c04ac5fd78d\") " pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.194961 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.286722 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9371f46-b818-44df-9f4f-4c04ac5fd78d-utilities\") pod \"redhat-marketplace-8spmk\" (UID: \"b9371f46-b818-44df-9f4f-4c04ac5fd78d\") " pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.286783 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz8c7\" (UniqueName: \"kubernetes.io/projected/b9371f46-b818-44df-9f4f-4c04ac5fd78d-kube-api-access-nz8c7\") pod \"redhat-marketplace-8spmk\" (UID: \"b9371f46-b818-44df-9f4f-4c04ac5fd78d\") " pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.286831 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9371f46-b818-44df-9f4f-4c04ac5fd78d-catalog-content\") pod \"redhat-marketplace-8spmk\" (UID: \"b9371f46-b818-44df-9f4f-4c04ac5fd78d\") " pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.287870 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9371f46-b818-44df-9f4f-4c04ac5fd78d-catalog-content\") pod \"redhat-marketplace-8spmk\" (UID: \"b9371f46-b818-44df-9f4f-4c04ac5fd78d\") " pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.287873 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9371f46-b818-44df-9f4f-4c04ac5fd78d-utilities\") pod \"redhat-marketplace-8spmk\" (UID: \"b9371f46-b818-44df-9f4f-4c04ac5fd78d\") " pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.321298 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz8c7\" (UniqueName: \"kubernetes.io/projected/b9371f46-b818-44df-9f4f-4c04ac5fd78d-kube-api-access-nz8c7\") pod \"redhat-marketplace-8spmk\" (UID: \"b9371f46-b818-44df-9f4f-4c04ac5fd78d\") " pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.344908 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.637535 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l6qbh"] Jan 29 14:07:51 crc kubenswrapper[4753]: I0129 14:07:51.742643 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8spmk"] Jan 29 14:07:52 crc kubenswrapper[4753]: I0129 14:07:52.103754 4753 generic.go:334] "Generic (PLEG): container finished" podID="3a8e6cb9-8e67-42dd-9827-812a46628fb5" containerID="4b4d20adfeaa77ec9f8b8302405e457f1fe4d1cfb8937f1ca428e8dbf47d1afc" exitCode=0 Jan 29 14:07:52 crc kubenswrapper[4753]: I0129 14:07:52.103838 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6qbh" event={"ID":"3a8e6cb9-8e67-42dd-9827-812a46628fb5","Type":"ContainerDied","Data":"4b4d20adfeaa77ec9f8b8302405e457f1fe4d1cfb8937f1ca428e8dbf47d1afc"} Jan 29 14:07:52 crc kubenswrapper[4753]: I0129 14:07:52.104188 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6qbh" event={"ID":"3a8e6cb9-8e67-42dd-9827-812a46628fb5","Type":"ContainerStarted","Data":"33d8dc1619f4f659b67e68f9d11fd650758e74b4cb959886b254d24b8bc0a243"} Jan 29 14:07:52 crc kubenswrapper[4753]: I0129 14:07:52.106956 4753 generic.go:334] "Generic (PLEG): container finished" podID="b9371f46-b818-44df-9f4f-4c04ac5fd78d" containerID="548dcc4e86956c4d9e4f561d6f2165a1f87059df0962f3fa8ab078ffcb959067" exitCode=0 Jan 29 14:07:52 crc kubenswrapper[4753]: I0129 14:07:52.107000 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8spmk" event={"ID":"b9371f46-b818-44df-9f4f-4c04ac5fd78d","Type":"ContainerDied","Data":"548dcc4e86956c4d9e4f561d6f2165a1f87059df0962f3fa8ab078ffcb959067"} Jan 29 14:07:52 crc kubenswrapper[4753]: I0129 14:07:52.107031 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8spmk" event={"ID":"b9371f46-b818-44df-9f4f-4c04ac5fd78d","Type":"ContainerStarted","Data":"7e60a2641df1362f4b06683b97e37bbade60c902913fc1a2657dada6a31356eb"} Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.115817 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8spmk" event={"ID":"b9371f46-b818-44df-9f4f-4c04ac5fd78d","Type":"ContainerStarted","Data":"0eaad309e3a9245227fa43be465f6d68512d71bf06de6a4b410fdf3bd5d4c636"} Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.118810 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6qbh" event={"ID":"3a8e6cb9-8e67-42dd-9827-812a46628fb5","Type":"ContainerStarted","Data":"939c45372e6592b7f45870dff771a1cd96e86f029b558d81c74b3d54c500c933"} Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.219821 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-shp7j"] Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.221268 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.226424 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.234798 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shp7j"] Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.318795 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78aeb34e-8507-419c-ae21-144a722afc4a-utilities\") pod \"certified-operators-shp7j\" (UID: \"78aeb34e-8507-419c-ae21-144a722afc4a\") " pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.318847 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmchl\" (UniqueName: \"kubernetes.io/projected/78aeb34e-8507-419c-ae21-144a722afc4a-kube-api-access-kmchl\") pod \"certified-operators-shp7j\" (UID: \"78aeb34e-8507-419c-ae21-144a722afc4a\") " pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.318901 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78aeb34e-8507-419c-ae21-144a722afc4a-catalog-content\") pod \"certified-operators-shp7j\" (UID: \"78aeb34e-8507-419c-ae21-144a722afc4a\") " pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.419755 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78aeb34e-8507-419c-ae21-144a722afc4a-utilities\") pod \"certified-operators-shp7j\" (UID: \"78aeb34e-8507-419c-ae21-144a722afc4a\") " pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.419803 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmchl\" (UniqueName: \"kubernetes.io/projected/78aeb34e-8507-419c-ae21-144a722afc4a-kube-api-access-kmchl\") pod \"certified-operators-shp7j\" (UID: \"78aeb34e-8507-419c-ae21-144a722afc4a\") " pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.419845 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78aeb34e-8507-419c-ae21-144a722afc4a-catalog-content\") pod \"certified-operators-shp7j\" (UID: \"78aeb34e-8507-419c-ae21-144a722afc4a\") " pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.420439 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78aeb34e-8507-419c-ae21-144a722afc4a-utilities\") pod \"certified-operators-shp7j\" (UID: \"78aeb34e-8507-419c-ae21-144a722afc4a\") " pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.420505 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78aeb34e-8507-419c-ae21-144a722afc4a-catalog-content\") pod \"certified-operators-shp7j\" (UID: 
\"78aeb34e-8507-419c-ae21-144a722afc4a\") " pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.426591 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bzksl"] Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.428296 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.433029 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.440793 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bzksl"] Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.452741 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmchl\" (UniqueName: \"kubernetes.io/projected/78aeb34e-8507-419c-ae21-144a722afc4a-kube-api-access-kmchl\") pod \"certified-operators-shp7j\" (UID: \"78aeb34e-8507-419c-ae21-144a722afc4a\") " pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.465529 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5sqmr"] Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.466242 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.474356 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5sqmr"] Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.520697 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg72g\" (UniqueName: \"kubernetes.io/projected/02b5ed5f-a363-45d9-b107-1d33890a617c-kube-api-access-bg72g\") pod \"community-operators-bzksl\" (UID: \"02b5ed5f-a363-45d9-b107-1d33890a617c\") " pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.520747 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02b5ed5f-a363-45d9-b107-1d33890a617c-utilities\") pod \"community-operators-bzksl\" (UID: \"02b5ed5f-a363-45d9-b107-1d33890a617c\") " pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.520819 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02b5ed5f-a363-45d9-b107-1d33890a617c-catalog-content\") pod \"community-operators-bzksl\" (UID: \"02b5ed5f-a363-45d9-b107-1d33890a617c\") " pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.544667 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.624907 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02b5ed5f-a363-45d9-b107-1d33890a617c-utilities\") pod \"community-operators-bzksl\" (UID: \"02b5ed5f-a363-45d9-b107-1d33890a617c\") " pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625437 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625471 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625496 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02b5ed5f-a363-45d9-b107-1d33890a617c-catalog-content\") pod \"community-operators-bzksl\" (UID: \"02b5ed5f-a363-45d9-b107-1d33890a617c\") " pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625521 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-registry-tls\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625546 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625564 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-registry-certificates\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625601 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-trusted-ca\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625619 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-bound-sa-token\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625636 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqqkl\" (UniqueName: \"kubernetes.io/projected/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-kube-api-access-dqqkl\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625744 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02b5ed5f-a363-45d9-b107-1d33890a617c-utilities\") pod \"community-operators-bzksl\" (UID: \"02b5ed5f-a363-45d9-b107-1d33890a617c\") " pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625810 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg72g\" (UniqueName: \"kubernetes.io/projected/02b5ed5f-a363-45d9-b107-1d33890a617c-kube-api-access-bg72g\") pod \"community-operators-bzksl\" (UID: \"02b5ed5f-a363-45d9-b107-1d33890a617c\") " pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.625931 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02b5ed5f-a363-45d9-b107-1d33890a617c-catalog-content\") pod \"community-operators-bzksl\" (UID: \"02b5ed5f-a363-45d9-b107-1d33890a617c\") " pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.669142 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.670168 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg72g\" (UniqueName: \"kubernetes.io/projected/02b5ed5f-a363-45d9-b107-1d33890a617c-kube-api-access-bg72g\") pod \"community-operators-bzksl\" (UID: \"02b5ed5f-a363-45d9-b107-1d33890a617c\") " pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.739291 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-registry-certificates\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.739385 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-trusted-ca\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: 
\"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.739417 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-bound-sa-token\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.739443 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqqkl\" (UniqueName: \"kubernetes.io/projected/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-kube-api-access-dqqkl\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.739548 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.740640 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-registry-certificates\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.741005 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-trusted-ca\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.741097 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-registry-tls\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.741163 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.741356 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.744853 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-registry-tls\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.746751 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.746889 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.760954 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-bound-sa-token\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.764846 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqqkl\" (UniqueName: \"kubernetes.io/projected/a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678-kube-api-access-dqqkl\") pod \"image-registry-66df7c8f76-5sqmr\" (UID: \"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.780690 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:53 crc kubenswrapper[4753]: I0129 14:07:53.929452 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shp7j"] Jan 29 14:07:53 crc kubenswrapper[4753]: W0129 14:07:53.936126 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78aeb34e_8507_419c_ae21_144a722afc4a.slice/crio-caa6d6be98da9e3d66688e614e4f8be9b309d211fd4af544d08597c326bca097 WatchSource:0}: Error finding container caa6d6be98da9e3d66688e614e4f8be9b309d211fd4af544d08597c326bca097: Status 404 returned error can't find the container with id caa6d6be98da9e3d66688e614e4f8be9b309d211fd4af544d08597c326bca097 Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.010904 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5sqmr"] Jan 29 14:07:54 crc kubenswrapper[4753]: W0129 14:07:54.019634 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9c9d1d3_79d7_4076_9d2e_3ec9d98dc678.slice/crio-20c12cbcb61656f05a33bda83a5827531c985aba2c5701b69b558fc2a12b6edc WatchSource:0}: Error finding container 20c12cbcb61656f05a33bda83a5827531c985aba2c5701b69b558fc2a12b6edc: Status 404 returned error can't find the container with id 20c12cbcb61656f05a33bda83a5827531c985aba2c5701b69b558fc2a12b6edc Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.126122 4753 generic.go:334] "Generic (PLEG): container finished" podID="b9371f46-b818-44df-9f4f-4c04ac5fd78d" containerID="0eaad309e3a9245227fa43be465f6d68512d71bf06de6a4b410fdf3bd5d4c636" exitCode=0 Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.126656 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8spmk" event={"ID":"b9371f46-b818-44df-9f4f-4c04ac5fd78d","Type":"ContainerDied","Data":"0eaad309e3a9245227fa43be465f6d68512d71bf06de6a4b410fdf3bd5d4c636"} Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.130475 4753 generic.go:334] "Generic (PLEG): container finished" podID="3a8e6cb9-8e67-42dd-9827-812a46628fb5" containerID="939c45372e6592b7f45870dff771a1cd96e86f029b558d81c74b3d54c500c933" exitCode=0 Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.130536 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6qbh" event={"ID":"3a8e6cb9-8e67-42dd-9827-812a46628fb5","Type":"ContainerDied","Data":"939c45372e6592b7f45870dff771a1cd96e86f029b558d81c74b3d54c500c933"} Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.136946 4753 generic.go:334] "Generic (PLEG): container finished" podID="78aeb34e-8507-419c-ae21-144a722afc4a" containerID="613407f31392f90622a3c9bee626f8fb12195c0b0272eda25580db569d9810ce" exitCode=0 Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.137008 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shp7j" event={"ID":"78aeb34e-8507-419c-ae21-144a722afc4a","Type":"ContainerDied","Data":"613407f31392f90622a3c9bee626f8fb12195c0b0272eda25580db569d9810ce"} Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.137029 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shp7j" 
event={"ID":"78aeb34e-8507-419c-ae21-144a722afc4a","Type":"ContainerStarted","Data":"caa6d6be98da9e3d66688e614e4f8be9b309d211fd4af544d08597c326bca097"} Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.142085 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" event={"ID":"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678","Type":"ContainerStarted","Data":"20c12cbcb61656f05a33bda83a5827531c985aba2c5701b69b558fc2a12b6edc"} Jan 29 14:07:54 crc kubenswrapper[4753]: I0129 14:07:54.169296 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bzksl"] Jan 29 14:07:54 crc kubenswrapper[4753]: W0129 14:07:54.177490 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02b5ed5f_a363_45d9_b107_1d33890a617c.slice/crio-06090ebdde5f7e86e17260b05b80a4ea5e52cc714ecb351121e1d993d4302e91 WatchSource:0}: Error finding container 06090ebdde5f7e86e17260b05b80a4ea5e52cc714ecb351121e1d993d4302e91: Status 404 returned error can't find the container with id 06090ebdde5f7e86e17260b05b80a4ea5e52cc714ecb351121e1d993d4302e91 Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.149956 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8spmk" event={"ID":"b9371f46-b818-44df-9f4f-4c04ac5fd78d","Type":"ContainerStarted","Data":"00297101a329c0849e419824768d2228e97dbb16e28b493b5f31ad4fb27bdd88"} Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.151567 4753 generic.go:334] "Generic (PLEG): container finished" podID="02b5ed5f-a363-45d9-b107-1d33890a617c" containerID="604f13162d250b55a90d74cede1a4347f244ac1c3b0c73be11dca0b720735e1c" exitCode=0 Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.151634 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bzksl" event={"ID":"02b5ed5f-a363-45d9-b107-1d33890a617c","Type":"ContainerDied","Data":"604f13162d250b55a90d74cede1a4347f244ac1c3b0c73be11dca0b720735e1c"} Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.151654 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bzksl" event={"ID":"02b5ed5f-a363-45d9-b107-1d33890a617c","Type":"ContainerStarted","Data":"06090ebdde5f7e86e17260b05b80a4ea5e52cc714ecb351121e1d993d4302e91"} Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.154741 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6qbh" event={"ID":"3a8e6cb9-8e67-42dd-9827-812a46628fb5","Type":"ContainerStarted","Data":"0241befec170c51cf42f8a9d374b107581e314b0a008b0e00d18a28b3be7ac97"} Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.157209 4753 generic.go:334] "Generic (PLEG): container finished" podID="78aeb34e-8507-419c-ae21-144a722afc4a" containerID="cd7946f7586e7346798f930c48719f1efbb46c177b9a31ab73a0c3c07e699cec" exitCode=0 Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.157278 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shp7j" event={"ID":"78aeb34e-8507-419c-ae21-144a722afc4a","Type":"ContainerDied","Data":"cd7946f7586e7346798f930c48719f1efbb46c177b9a31ab73a0c3c07e699cec"} Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.159088 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" 
event={"ID":"a9c9d1d3-79d7-4076-9d2e-3ec9d98dc678","Type":"ContainerStarted","Data":"9455ba1c3cf496c5cda0270392ea61f2ef5c9ff347ea26d0c11be1c4241a43c4"} Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.159228 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.177353 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8spmk" podStartSLOduration=1.6234221610000001 podStartE2EDuration="4.177336471s" podCreationTimestamp="2026-01-29 14:07:51 +0000 UTC" firstStartedPulling="2026-01-29 14:07:52.109306027 +0000 UTC m=+306.804040459" lastFinishedPulling="2026-01-29 14:07:54.663220387 +0000 UTC m=+309.357954769" observedRunningTime="2026-01-29 14:07:55.173904088 +0000 UTC m=+309.868638470" watchObservedRunningTime="2026-01-29 14:07:55.177336471 +0000 UTC m=+309.872070853" Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.201775 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l6qbh" podStartSLOduration=2.694073799 podStartE2EDuration="5.201759908s" podCreationTimestamp="2026-01-29 14:07:50 +0000 UTC" firstStartedPulling="2026-01-29 14:07:52.106268815 +0000 UTC m=+306.801003237" lastFinishedPulling="2026-01-29 14:07:54.613954954 +0000 UTC m=+309.308689346" observedRunningTime="2026-01-29 14:07:55.198605482 +0000 UTC m=+309.893339874" watchObservedRunningTime="2026-01-29 14:07:55.201759908 +0000 UTC m=+309.896494290" Jan 29 14:07:55 crc kubenswrapper[4753]: I0129 14:07:55.269198 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" podStartSLOduration=2.269178665 podStartE2EDuration="2.269178665s" podCreationTimestamp="2026-01-29 14:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:07:55.265754462 +0000 UTC m=+309.960488854" watchObservedRunningTime="2026-01-29 14:07:55.269178665 +0000 UTC m=+309.963913047" Jan 29 14:07:56 crc kubenswrapper[4753]: I0129 14:07:56.172170 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shp7j" event={"ID":"78aeb34e-8507-419c-ae21-144a722afc4a","Type":"ContainerStarted","Data":"09478ba105b92f86b95095ef0a0aa21d4a7f7d63fd413b00c9df5ec866d082ca"} Jan 29 14:07:56 crc kubenswrapper[4753]: I0129 14:07:56.192261 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-shp7j" podStartSLOduration=1.793754695 podStartE2EDuration="3.192245738s" podCreationTimestamp="2026-01-29 14:07:53 +0000 UTC" firstStartedPulling="2026-01-29 14:07:54.139230133 +0000 UTC m=+308.833964515" lastFinishedPulling="2026-01-29 14:07:55.537721166 +0000 UTC m=+310.232455558" observedRunningTime="2026-01-29 14:07:56.188039553 +0000 UTC m=+310.882773935" watchObservedRunningTime="2026-01-29 14:07:56.192245738 +0000 UTC m=+310.886980120" Jan 29 14:07:57 crc kubenswrapper[4753]: I0129 14:07:57.181139 4753 generic.go:334] "Generic (PLEG): container finished" podID="02b5ed5f-a363-45d9-b107-1d33890a617c" containerID="881460e664deaf5464c8a3dc2651f91b716967b02407816e770b2abdd173328d" exitCode=0 Jan 29 14:07:57 crc kubenswrapper[4753]: I0129 14:07:57.181226 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bzksl" event={"ID":"02b5ed5f-a363-45d9-b107-1d33890a617c","Type":"ContainerDied","Data":"881460e664deaf5464c8a3dc2651f91b716967b02407816e770b2abdd173328d"} Jan 29 14:07:58 crc kubenswrapper[4753]: I0129 14:07:58.189183 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bzksl" event={"ID":"02b5ed5f-a363-45d9-b107-1d33890a617c","Type":"ContainerStarted","Data":"e95bf7cbcc4e56498390f58d2d7ef0ddbdc6135c6cab848d1e8ae3dc7fc7e2c3"} Jan 29 14:07:58 crc kubenswrapper[4753]: I0129 14:07:58.218543 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bzksl" podStartSLOduration=2.772093536 podStartE2EDuration="5.218522795s" podCreationTimestamp="2026-01-29 14:07:53 +0000 UTC" firstStartedPulling="2026-01-29 14:07:55.152905166 +0000 UTC m=+309.847639548" lastFinishedPulling="2026-01-29 14:07:57.599334425 +0000 UTC m=+312.294068807" observedRunningTime="2026-01-29 14:07:58.21577906 +0000 UTC m=+312.910513442" watchObservedRunningTime="2026-01-29 14:07:58.218522795 +0000 UTC m=+312.913257177" Jan 29 14:08:01 crc kubenswrapper[4753]: I0129 14:08:01.196026 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:08:01 crc kubenswrapper[4753]: I0129 14:08:01.198833 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:08:01 crc kubenswrapper[4753]: I0129 14:08:01.345840 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:08:01 crc kubenswrapper[4753]: I0129 14:08:01.345930 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:08:01 crc kubenswrapper[4753]: I0129 14:08:01.402961 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:08:02 crc kubenswrapper[4753]: I0129 14:08:02.269510 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l6qbh" podUID="3a8e6cb9-8e67-42dd-9827-812a46628fb5" containerName="registry-server" probeResult="failure" output=< Jan 29 14:08:02 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 14:08:02 crc kubenswrapper[4753]: > Jan 29 14:08:02 crc kubenswrapper[4753]: I0129 14:08:02.292227 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8spmk" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.109522 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd"] Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.109759 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" podUID="53a58427-36ab-4310-bcb4-9d70376435d9" containerName="controller-manager" containerID="cri-o://d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879" gracePeriod=30 Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.545261 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.545662 4753 
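
The startup-probe failure above, with output `timeout: failed to connect service ":50051" within 1s`, is the signature of a gRPC health check against the catalog pod's registry-server port. Assuming the probe speaks the standard grpc.health.v1 protocol (the message format suggests the common grpc-health-probe tool, though the pod spec is not in this log), an equivalent check in Go would be:

    package main

    import (
        "context"
        "fmt"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        healthpb "google.golang.org/grpc/health/grpc_health_v1"
    )

    func main() {
        // Give up after 1s, like the probe in the log.
        ctx, cancel := context.WithTimeout(context.Background(), time.Second)
        defer cancel()

        conn, err := grpc.DialContext(ctx, "localhost:50051",
            grpc.WithTransportCredentials(insecure.NewCredentials()),
            grpc.WithBlock())
        if err != nil {
            fmt.Println(`timeout: failed to connect service ":50051" within 1s`)
            return
        }
        defer conn.Close()

        resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
        if err != nil || resp.GetStatus() != healthpb.HealthCheckResponse_SERVING {
            fmt.Println("unhealthy")
            return
        }
        fmt.Println("healthy")
    }

The "startup ... unhealthy" followed later by "startup ... started" entries are this same check failing until the registry server begins answering, then succeeding.
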
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.560027 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.617048 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.686686 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-proxy-ca-bundles\") pod \"53a58427-36ab-4310-bcb4-9d70376435d9\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.686770 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-config\") pod \"53a58427-36ab-4310-bcb4-9d70376435d9\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.686817 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-client-ca\") pod \"53a58427-36ab-4310-bcb4-9d70376435d9\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.686859 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrxcz\" (UniqueName: \"kubernetes.io/projected/53a58427-36ab-4310-bcb4-9d70376435d9-kube-api-access-xrxcz\") pod \"53a58427-36ab-4310-bcb4-9d70376435d9\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.687437 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a58427-36ab-4310-bcb4-9d70376435d9-serving-cert\") pod \"53a58427-36ab-4310-bcb4-9d70376435d9\" (UID: \"53a58427-36ab-4310-bcb4-9d70376435d9\") " Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.687863 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "53a58427-36ab-4310-bcb4-9d70376435d9" (UID: "53a58427-36ab-4310-bcb4-9d70376435d9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.688112 4753 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.688203 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-client-ca" (OuterVolumeSpecName: "client-ca") pod "53a58427-36ab-4310-bcb4-9d70376435d9" (UID: "53a58427-36ab-4310-bcb4-9d70376435d9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.688730 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-config" (OuterVolumeSpecName: "config") pod "53a58427-36ab-4310-bcb4-9d70376435d9" (UID: "53a58427-36ab-4310-bcb4-9d70376435d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.695069 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a58427-36ab-4310-bcb4-9d70376435d9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "53a58427-36ab-4310-bcb4-9d70376435d9" (UID: "53a58427-36ab-4310-bcb4-9d70376435d9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.695227 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a58427-36ab-4310-bcb4-9d70376435d9-kube-api-access-xrxcz" (OuterVolumeSpecName: "kube-api-access-xrxcz") pod "53a58427-36ab-4310-bcb4-9d70376435d9" (UID: "53a58427-36ab-4310-bcb4-9d70376435d9"). InnerVolumeSpecName "kube-api-access-xrxcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.747660 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.748566 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.790334 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.790392 4753 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53a58427-36ab-4310-bcb4-9d70376435d9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.790418 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrxcz\" (UniqueName: \"kubernetes.io/projected/53a58427-36ab-4310-bcb4-9d70376435d9-kube-api-access-xrxcz\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.790444 4753 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a58427-36ab-4310-bcb4-9d70376435d9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:03 crc kubenswrapper[4753]: I0129 14:08:03.809272 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.234536 4753 generic.go:334] "Generic (PLEG): container finished" podID="53a58427-36ab-4310-bcb4-9d70376435d9" containerID="d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879" exitCode=0 Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.234600 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" 
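
The controller-manager-9b9b64d5f-s8ttd teardown above follows the standard order: the container is killed with gracePeriod=30, the reconciler unmounts and detaches its volumes while the container drains, and PLEG then reports ContainerDied with exitCode=0 (SIGTERM sufficed). The grace-period contract itself is SIGTERM first, SIGKILL on expiry; a generic illustration (stopWithGrace is a made-up helper, and the real escalation happens inside CRI-O, not the kubelet):

    package main

    import (
        "os/exec"
        "syscall"
        "time"
    )

    // stopWithGrace delivers SIGTERM, then escalates to SIGKILL only if the
    // process is still alive when the grace period expires.
    func stopWithGrace(cmd *exec.Cmd, grace time.Duration) {
        _ = cmd.Process.Signal(syscall.SIGTERM)

        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()

        select {
        case <-done: // exited in time; the log's exitCode=0 case
        case <-time.After(grace):
            _ = cmd.Process.Kill() // SIGKILL once gracePeriod elapses
        }
    }

    func main() {
        cmd := exec.Command("sleep", "300")
        if err := cmd.Start(); err != nil {
            return
        }
        stopWithGrace(cmd, 30*time.Second) // gracePeriod=30, as in the log
    }
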
event={"ID":"53a58427-36ab-4310-bcb4-9d70376435d9","Type":"ContainerDied","Data":"d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879"} Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.234663 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" event={"ID":"53a58427-36ab-4310-bcb4-9d70376435d9","Type":"ContainerDied","Data":"b8c298dae090d852e12b2becffba8abf5429e3ab0fbc2e5842ff5c1fce5bfe89"} Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.234695 4753 scope.go:117] "RemoveContainer" containerID="d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.235410 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.256993 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd"] Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.263982 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9b9b64d5f-s8ttd"] Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.274108 4753 scope.go:117] "RemoveContainer" containerID="d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879" Jan 29 14:08:04 crc kubenswrapper[4753]: E0129 14:08:04.274732 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879\": container with ID starting with d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879 not found: ID does not exist" containerID="d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.274977 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879"} err="failed to get container status \"d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879\": rpc error: code = NotFound desc = could not find container \"d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879\": container with ID starting with d88cf55069027a0508b307fa23b0a40c5089d50f1df22903249037f28721f879 not found: ID does not exist" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.305197 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-shp7j" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.305553 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bzksl" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.401313 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-548968b897-6dbxj"] Jan 29 14:08:04 crc kubenswrapper[4753]: E0129 14:08:04.401599 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a58427-36ab-4310-bcb4-9d70376435d9" containerName="controller-manager" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.401618 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a58427-36ab-4310-bcb4-9d70376435d9" containerName="controller-manager" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.401722 4753 
memory_manager.go:354] "RemoveStaleState removing state" podUID="53a58427-36ab-4310-bcb4-9d70376435d9" containerName="controller-manager" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.402109 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.408141 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.409494 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-548968b897-6dbxj"] Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.409871 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.410066 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.410345 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.410747 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.410765 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.416539 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.501620 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/369c886d-650b-4ddd-ac79-7e56676e8403-proxy-ca-bundles\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.501675 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369c886d-650b-4ddd-ac79-7e56676e8403-config\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.501706 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369c886d-650b-4ddd-ac79-7e56676e8403-serving-cert\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.501730 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/369c886d-650b-4ddd-ac79-7e56676e8403-client-ca\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " 
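
The burst of reflector.go "Caches populated for *v1.ConfigMap/*v1.Secret from object-..." lines above shows the kubelet starting one dedicated watch per object the new pod references, rather than caching whole namespaces. The same single-object watch pattern, sketched with client-go (the in-cluster config and hard-coded names are assumptions for illustration):

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
    )

    func main() {
        cfg, err := rest.InClusterConfig()
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        // Watch exactly one ConfigMap, the way the kubelet scopes its cache
        // to object-"openshift-controller-manager"/"config".
        w, err := cs.CoreV1().ConfigMaps("openshift-controller-manager").Watch(
            context.Background(),
            metav1.ListOptions{FieldSelector: "metadata.name=config"},
        )
        if err != nil {
            panic(err)
        }
        for ev := range w.ResultChan() {
            fmt.Println(ev.Type) // ADDED, MODIFIED, DELETED
        }
    }
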
pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.501771 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8xs7\" (UniqueName: \"kubernetes.io/projected/369c886d-650b-4ddd-ac79-7e56676e8403-kube-api-access-k8xs7\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.603209 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/369c886d-650b-4ddd-ac79-7e56676e8403-proxy-ca-bundles\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.603331 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369c886d-650b-4ddd-ac79-7e56676e8403-config\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.603380 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369c886d-650b-4ddd-ac79-7e56676e8403-serving-cert\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.603413 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/369c886d-650b-4ddd-ac79-7e56676e8403-client-ca\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.603478 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8xs7\" (UniqueName: \"kubernetes.io/projected/369c886d-650b-4ddd-ac79-7e56676e8403-kube-api-access-k8xs7\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.604702 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/369c886d-650b-4ddd-ac79-7e56676e8403-proxy-ca-bundles\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.605507 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/369c886d-650b-4ddd-ac79-7e56676e8403-client-ca\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.606643 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369c886d-650b-4ddd-ac79-7e56676e8403-config\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.610578 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369c886d-650b-4ddd-ac79-7e56676e8403-serving-cert\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.626640 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8xs7\" (UniqueName: \"kubernetes.io/projected/369c886d-650b-4ddd-ac79-7e56676e8403-kube-api-access-k8xs7\") pod \"controller-manager-548968b897-6dbxj\" (UID: \"369c886d-650b-4ddd-ac79-7e56676e8403\") " pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.718983 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:04 crc kubenswrapper[4753]: I0129 14:08:04.942060 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-548968b897-6dbxj"] Jan 29 14:08:05 crc kubenswrapper[4753]: I0129 14:08:05.242872 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" event={"ID":"369c886d-650b-4ddd-ac79-7e56676e8403","Type":"ContainerStarted","Data":"05c4d95b1a8f8002a19e48e4f76d51d9fdbcbd98a017951951bdcc2c4c5ed88f"} Jan 29 14:08:05 crc kubenswrapper[4753]: I0129 14:08:05.242971 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" event={"ID":"369c886d-650b-4ddd-ac79-7e56676e8403","Type":"ContainerStarted","Data":"88470e691a3f2d3fc89accef20b239c83d868a5b2ad426d3948893d3050d5ba1"} Jan 29 14:08:05 crc kubenswrapper[4753]: I0129 14:08:05.243626 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:05 crc kubenswrapper[4753]: I0129 14:08:05.250859 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" Jan 29 14:08:05 crc kubenswrapper[4753]: I0129 14:08:05.293772 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-548968b897-6dbxj" podStartSLOduration=2.293743653 podStartE2EDuration="2.293743653s" podCreationTimestamp="2026-01-29 14:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:08:05.266774187 +0000 UTC m=+319.961508609" watchObservedRunningTime="2026-01-29 14:08:05.293743653 +0000 UTC m=+319.988478075" Jan 29 14:08:06 crc kubenswrapper[4753]: I0129 14:08:06.159449 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a58427-36ab-4310-bcb4-9d70376435d9" path="/var/lib/kubelet/pods/53a58427-36ab-4310-bcb4-9d70376435d9/volumes" Jan 29 14:08:11 crc kubenswrapper[4753]: I0129 14:08:11.255031 4753 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:08:11 crc kubenswrapper[4753]: I0129 14:08:11.327537 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l6qbh" Jan 29 14:08:13 crc kubenswrapper[4753]: I0129 14:08:13.787441 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-5sqmr" Jan 29 14:08:13 crc kubenswrapper[4753]: I0129 14:08:13.859206 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rf798"] Jan 29 14:08:38 crc kubenswrapper[4753]: I0129 14:08:38.911941 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" podUID="6a18fef7-00cd-4027-bec1-91ded07e3bfb" containerName="registry" containerID="cri-o://3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c" gracePeriod=30 Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.435330 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.480341 4753 generic.go:334] "Generic (PLEG): container finished" podID="6a18fef7-00cd-4027-bec1-91ded07e3bfb" containerID="3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c" exitCode=0 Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.480393 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" event={"ID":"6a18fef7-00cd-4027-bec1-91ded07e3bfb","Type":"ContainerDied","Data":"3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c"} Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.480466 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" event={"ID":"6a18fef7-00cd-4027-bec1-91ded07e3bfb","Type":"ContainerDied","Data":"f52becb34a6d57151b96fa50e01a058ff47cb6a6c2f9c31fcc35f199487d48b3"} Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.480489 4753 scope.go:117] "RemoveContainer" containerID="3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.480676 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rf798" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.503359 4753 scope.go:117] "RemoveContainer" containerID="3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c" Jan 29 14:08:39 crc kubenswrapper[4753]: E0129 14:08:39.503941 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c\": container with ID starting with 3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c not found: ID does not exist" containerID="3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.503977 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c"} err="failed to get container status \"3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c\": rpc error: code = NotFound desc = could not find container \"3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c\": container with ID starting with 3e79a5091b4edba000c08b7df23a217366de1d7dcba4b0c0e4ca08d44687735c not found: ID does not exist" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.517406 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-trusted-ca\") pod \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.517454 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfcv2\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-kube-api-access-jfcv2\") pod \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.517479 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-tls\") pod \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.517507 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6a18fef7-00cd-4027-bec1-91ded07e3bfb-installation-pull-secrets\") pod \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.517553 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-certificates\") pod \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.517744 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 
14:08:39.517788 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-bound-sa-token\") pod \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.517814 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6a18fef7-00cd-4027-bec1-91ded07e3bfb-ca-trust-extracted\") pod \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\" (UID: \"6a18fef7-00cd-4027-bec1-91ded07e3bfb\") " Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.518846 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6a18fef7-00cd-4027-bec1-91ded07e3bfb" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.519008 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6a18fef7-00cd-4027-bec1-91ded07e3bfb" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.530243 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6a18fef7-00cd-4027-bec1-91ded07e3bfb" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.530341 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a18fef7-00cd-4027-bec1-91ded07e3bfb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6a18fef7-00cd-4027-bec1-91ded07e3bfb" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.530646 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6a18fef7-00cd-4027-bec1-91ded07e3bfb" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.530872 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-kube-api-access-jfcv2" (OuterVolumeSpecName: "kube-api-access-jfcv2") pod "6a18fef7-00cd-4027-bec1-91ded07e3bfb" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb"). InnerVolumeSpecName "kube-api-access-jfcv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.531203 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6a18fef7-00cd-4027-bec1-91ded07e3bfb" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.538296 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a18fef7-00cd-4027-bec1-91ded07e3bfb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6a18fef7-00cd-4027-bec1-91ded07e3bfb" (UID: "6a18fef7-00cd-4027-bec1-91ded07e3bfb"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.619118 4753 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6a18fef7-00cd-4027-bec1-91ded07e3bfb-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.619215 4753 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.619238 4753 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.619258 4753 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6a18fef7-00cd-4027-bec1-91ded07e3bfb-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.619280 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a18fef7-00cd-4027-bec1-91ded07e3bfb-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.619299 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfcv2\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-kube-api-access-jfcv2\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.619320 4753 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6a18fef7-00cd-4027-bec1-91ded07e3bfb-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.835079 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rf798"] Jan 29 14:08:39 crc kubenswrapper[4753]: I0129 14:08:39.843663 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rf798"] Jan 29 14:08:40 crc kubenswrapper[4753]: I0129 14:08:40.163199 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a18fef7-00cd-4027-bec1-91ded07e3bfb" path="/var/lib/kubelet/pods/6a18fef7-00cd-4027-bec1-91ded07e3bfb/volumes" Jan 29 14:08:57 crc kubenswrapper[4753]: I0129 14:08:57.055313 4753 
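
Note the ordering in the registry teardown above: "Cleaned up orphaned pod volumes dir" for 6a18fef7-00cd-4027-bec1-91ded07e3bfb appears only after every one of its volumes has logged "Volume detached ... DevicePath \"\"". A sketch of that guard under the standard /var/lib/kubelet layout (cleanupPodVolumesDir is an illustrative helper; the kubelet's real check lives in kubelet_volumes.go):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // cleanupPodVolumesDir removes a terminated pod's volumes dir, but only
    // if no plugin still has content mounted underneath it.
    func cleanupPodVolumesDir(podUID string) error {
        dir := filepath.Join("/var/lib/kubelet/pods", podUID, "volumes")
        plugins, err := os.ReadDir(dir)
        if err != nil {
            return err
        }
        for _, p := range plugins {
            mounts, err := os.ReadDir(filepath.Join(dir, p.Name()))
            if err != nil {
                return err
            }
            if len(mounts) > 0 {
                return fmt.Errorf("volumes still present under %s", p.Name())
            }
        }
        return os.RemoveAll(dir)
    }

    func main() {
        fmt.Println(cleanupPodVolumesDir("6a18fef7-00cd-4027-bec1-91ded07e3bfb"))
    }
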
patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:08:57 crc kubenswrapper[4753]: I0129 14:08:57.056792 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:09:27 crc kubenswrapper[4753]: I0129 14:09:27.055662 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:09:27 crc kubenswrapper[4753]: I0129 14:09:27.056638 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:09:57 crc kubenswrapper[4753]: I0129 14:09:57.054977 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:09:57 crc kubenswrapper[4753]: I0129 14:09:57.056041 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:09:57 crc kubenswrapper[4753]: I0129 14:09:57.056120 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:09:57 crc kubenswrapper[4753]: I0129 14:09:57.057101 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75366cb6f8276e320c7a54930212ad89a510f8fd57854ceefd9052e46fbae159"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:09:57 crc kubenswrapper[4753]: I0129 14:09:57.057242 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://75366cb6f8276e320c7a54930212ad89a510f8fd57854ceefd9052e46fbae159" gracePeriod=600 Jan 29 14:09:58 crc kubenswrapper[4753]: I0129 14:09:58.069095 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="75366cb6f8276e320c7a54930212ad89a510f8fd57854ceefd9052e46fbae159" exitCode=0 Jan 29 14:09:58 crc kubenswrapper[4753]: I0129 14:09:58.069714 4753 kubelet.go:2453] 
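
The machine-config-daemon liveness failures above land exactly 30 seconds apart (14:08:57, 14:09:27, 14:09:57), and only after the third one does the kubelet declare the probe unhealthy and kill the container with gracePeriod=600 for a restart. That cadence is consistent with periodSeconds=30 and failureThreshold=3; the probe below is inferred from the timing, not read from the DaemonSet:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        probe := corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Host: "127.0.0.1", // matches the failing GET in the log
                    Path: "/health",
                    Port: intstr.FromInt(8798),
                },
            },
            PeriodSeconds:    30, // failures logged every 30s
            FailureThreshold: 3,  // restart after the third failure
        }
        fmt.Printf("restart after %ds of consecutive failures\n",
            probe.PeriodSeconds*probe.FailureThreshold) // 90
    }

The second cycle below (14:11:57, 14:12:27, 14:12:57, then another restart) repeats the same pattern.
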
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"75366cb6f8276e320c7a54930212ad89a510f8fd57854ceefd9052e46fbae159"} Jan 29 14:09:58 crc kubenswrapper[4753]: I0129 14:09:58.069766 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"90491981003addce6f1b9660cc7a2bd6006504000ac2c231afb7b0bfc2f931be"} Jan 29 14:09:58 crc kubenswrapper[4753]: I0129 14:09:58.069799 4753 scope.go:117] "RemoveContainer" containerID="0a776e0c0530554ac1c5bb596b978197a529fc4081e52e28495217149f826257" Jan 29 14:10:46 crc kubenswrapper[4753]: I0129 14:10:46.518280 4753 scope.go:117] "RemoveContainer" containerID="bab852b08fa726a717811425c76977d4b8a81a9e88e319395564a351fec56207" Jan 29 14:10:46 crc kubenswrapper[4753]: I0129 14:10:46.548316 4753 scope.go:117] "RemoveContainer" containerID="f4731879011cb220b7ab9edaf8cb098ee14c9ae2a83105767e8531e6ff1d333b" Jan 29 14:11:46 crc kubenswrapper[4753]: I0129 14:11:46.599406 4753 scope.go:117] "RemoveContainer" containerID="c3db8665aab739287a49e3fc2b8a6b68d4fbdf39b48aa44713d4c755ea5d85b1" Jan 29 14:11:46 crc kubenswrapper[4753]: I0129 14:11:46.627357 4753 scope.go:117] "RemoveContainer" containerID="ca7c762e90b6637412c0d6cdaddcf3b29b9cd833da6bb3cf5b70f8755ebe33fb" Jan 29 14:11:57 crc kubenswrapper[4753]: I0129 14:11:57.054889 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:11:57 crc kubenswrapper[4753]: I0129 14:11:57.055663 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:12:27 crc kubenswrapper[4753]: I0129 14:12:27.055196 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:12:27 crc kubenswrapper[4753]: I0129 14:12:27.055976 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:12:57 crc kubenswrapper[4753]: I0129 14:12:57.055359 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:12:57 crc kubenswrapper[4753]: I0129 14:12:57.056599 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" 
podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:12:57 crc kubenswrapper[4753]: I0129 14:12:57.056696 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:12:57 crc kubenswrapper[4753]: I0129 14:12:57.057646 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90491981003addce6f1b9660cc7a2bd6006504000ac2c231afb7b0bfc2f931be"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:12:57 crc kubenswrapper[4753]: I0129 14:12:57.057735 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://90491981003addce6f1b9660cc7a2bd6006504000ac2c231afb7b0bfc2f931be" gracePeriod=600 Jan 29 14:12:57 crc kubenswrapper[4753]: I0129 14:12:57.452934 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="90491981003addce6f1b9660cc7a2bd6006504000ac2c231afb7b0bfc2f931be" exitCode=0 Jan 29 14:12:57 crc kubenswrapper[4753]: I0129 14:12:57.453035 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"90491981003addce6f1b9660cc7a2bd6006504000ac2c231afb7b0bfc2f931be"} Jan 29 14:12:57 crc kubenswrapper[4753]: I0129 14:12:57.453590 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"30a83f4047e7a21e63740d93f083e75525a0e3fe674659ba74e59493ea388ecf"} Jan 29 14:12:57 crc kubenswrapper[4753]: I0129 14:12:57.453623 4753 scope.go:117] "RemoveContainer" containerID="75366cb6f8276e320c7a54930212ad89a510f8fd57854ceefd9052e46fbae159" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.173592 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9pd9r"] Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.175192 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovn-controller" containerID="cri-o://98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e" gracePeriod=30 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.175244 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="nbdb" containerID="cri-o://8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43" gracePeriod=30 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.175295 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f" gracePeriod=30 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.175325 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kube-rbac-proxy-node" containerID="cri-o://efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9" gracePeriod=30 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.175358 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="northd" containerID="cri-o://92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86" gracePeriod=30 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.175330 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovn-acl-logging" containerID="cri-o://ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed" gracePeriod=30 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.175406 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="sbdb" containerID="cri-o://45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355" gracePeriod=30 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.248299 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" containerID="cri-o://c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45" gracePeriod=30 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.546829 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/2.log" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.549686 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovn-acl-logging/0.log" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.550198 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovn-controller/0.log" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.550690 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601643 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pxkcg"] Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601820 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovn-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601831 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovn-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601838 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kube-rbac-proxy-ovn-metrics" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601843 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kube-rbac-proxy-ovn-metrics" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601855 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovn-acl-logging" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601861 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovn-acl-logging" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601870 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601875 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601883 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kubecfg-setup" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601888 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kubecfg-setup" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601896 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="sbdb" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601902 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="sbdb" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601911 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="nbdb" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601916 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="nbdb" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601926 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="northd" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601931 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="northd" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601940 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" 
containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601945 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601953 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kube-rbac-proxy-node" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601958 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kube-rbac-proxy-node" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601966 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601971 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.601978 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a18fef7-00cd-4027-bec1-91ded07e3bfb" containerName="registry" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.601984 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a18fef7-00cd-4027-bec1-91ded07e3bfb" containerName="registry" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602067 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovn-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602078 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kube-rbac-proxy-node" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602085 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602092 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="sbdb" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602098 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602104 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602110 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="nbdb" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602116 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovn-acl-logging" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602125 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="northd" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602131 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="kube-rbac-proxy-ovn-metrics" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602139 4753 memory_manager.go:354] 
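
The long run of cpu_manager/memory_manager lines above and below is admission housekeeping for the replacement pod ovnkube-node-pxkcg: every CPUSet and memory-state entry still recorded for the old ovnkube-node-9pd9r containers (and the deleted registry pod) is dropped first. Despite the E prefix on the cpu_manager lines, this is routine bookkeeping, one pair of lines per stale container entry. A sketch of the reconciliation (types and names are mine, not kubelet's):

    package main

    import "fmt"

    type key struct{ podUID, container string }

    // removeStale drops any recorded assignment whose pod is no longer
    // admitted, mirroring "RemoveStaleState: removing container" followed by
    // "Deleted CPUSet assignment".
    func removeStale(assignments map[key][]int, active map[string]bool) {
        for k := range assignments {
            if !active[k.podUID] {
                fmt.Printf("Deleted CPUSet assignment podUID=%q containerName=%q\n",
                    k.podUID, k.container)
                delete(assignments, k)
            }
        }
    }

    func main() {
        state := map[key][]int{
            {"a796c89a-761f-48d7-80b5-031f75703f32", "ovn-controller"}: {0, 1},
        }
        removeStale(state, map[string]bool{}) // old pod no longer active
        fmt.Println(len(state))               // 0
    }
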
"RemoveStaleState removing state" podUID="6a18fef7-00cd-4027-bec1-91ded07e3bfb" containerName="registry" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.602250 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602257 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.602334 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796c89a-761f-48d7-80b5-031f75703f32" containerName="ovnkube-controller" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.604006 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.638446 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovnkube-controller/2.log" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.641647 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovn-acl-logging/0.log" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642099 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9pd9r_a796c89a-761f-48d7-80b5-031f75703f32/ovn-controller/0.log" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642498 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-ovn\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642525 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-systemd-units\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642560 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-env-overrides\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642588 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a796c89a-761f-48d7-80b5-031f75703f32-ovn-node-metrics-cert\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642609 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-script-lib\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642549 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642578 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642658 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642630 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-etc-openvswitch\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642734 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-config\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642773 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-node-log\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642808 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-slash\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642869 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-log-socket\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642902 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-netns\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642932 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-netd\") pod 
\"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.642960 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643004 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-var-lib-openvswitch\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643041 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl6hh\" (UniqueName: \"kubernetes.io/projected/a796c89a-761f-48d7-80b5-031f75703f32-kube-api-access-xl6hh\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643071 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-systemd\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643109 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-ovn-kubernetes\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643181 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-bin\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643209 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-kubelet\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643235 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-openvswitch\") pod \"a796c89a-761f-48d7-80b5-031f75703f32\" (UID: \"a796c89a-761f-48d7-80b5-031f75703f32\") " Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643296 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643323 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643521 4753 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643541 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643558 4753 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643577 4753 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643595 4753 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643642 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643685 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643696 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643720 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643731 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-log-socket" (OuterVolumeSpecName: "log-socket") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643755 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-node-log" (OuterVolumeSpecName: "node-log") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643757 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643757 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643788 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-slash" (OuterVolumeSpecName: "host-slash") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.643835 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.644123 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.644339 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645093 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45" exitCode=0 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645125 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355" exitCode=0 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645140 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43" exitCode=0 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645137 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645171 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86" exitCode=0 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645180 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f" exitCode=0 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645189 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9" exitCode=0 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645202 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645204 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed" exitCode=143 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645216 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645220 4753 generic.go:334] "Generic (PLEG): container finished" podID="a796c89a-761f-48d7-80b5-031f75703f32" containerID="98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e" exitCode=143 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 
14:13:21.645226 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645241 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645254 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645267 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645280 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645286 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645292 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645298 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645306 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645219 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645311 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645448 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645464 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645481 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645501 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645511 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645520 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645527 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645534 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645541 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645549 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645556 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645563 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645569 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645580 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645591 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645599 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645606 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645613 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645620 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645626 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645632 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645639 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645646 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645652 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645661 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pd9r" event={"ID":"a796c89a-761f-48d7-80b5-031f75703f32","Type":"ContainerDied","Data":"5404f3066943ea510e2486041cf0532862eeb302ba50806da7c2eed384dcdc8a"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645670 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645678 4753 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645686 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645692 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645699 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645705 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645712 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645718 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645726 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645735 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.645298 4753 scope.go:117] "RemoveContainer" containerID="c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.648912 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vfrvp_63926a91-5e42-4768-8277-55a0113cb5e2/kube-multus/1.log" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.650023 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vfrvp_63926a91-5e42-4768-8277-55a0113cb5e2/kube-multus/0.log" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.650168 4753 generic.go:334] "Generic (PLEG): container finished" podID="63926a91-5e42-4768-8277-55a0113cb5e2" containerID="a178927f4539cbdccdd23649d9477c2c6c0b238d6a9702240d428e1e2fe90d4b" exitCode=2 Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.650298 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vfrvp" event={"ID":"63926a91-5e42-4768-8277-55a0113cb5e2","Type":"ContainerDied","Data":"a178927f4539cbdccdd23649d9477c2c6c0b238d6a9702240d428e1e2fe90d4b"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.650346 4753 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6"} Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.650889 4753 scope.go:117] "RemoveContainer" containerID="a178927f4539cbdccdd23649d9477c2c6c0b238d6a9702240d428e1e2fe90d4b" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.651283 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-vfrvp_openshift-multus(63926a91-5e42-4768-8277-55a0113cb5e2)\"" pod="openshift-multus/multus-vfrvp" podUID="63926a91-5e42-4768-8277-55a0113cb5e2" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.652676 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a796c89a-761f-48d7-80b5-031f75703f32-kube-api-access-xl6hh" (OuterVolumeSpecName: "kube-api-access-xl6hh") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "kube-api-access-xl6hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.653317 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a796c89a-761f-48d7-80b5-031f75703f32-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.668131 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a796c89a-761f-48d7-80b5-031f75703f32" (UID: "a796c89a-761f-48d7-80b5-031f75703f32"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.680953 4753 scope.go:117] "RemoveContainer" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.699260 4753 scope.go:117] "RemoveContainer" containerID="45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.718831 4753 scope.go:117] "RemoveContainer" containerID="8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.734716 4753 scope.go:117] "RemoveContainer" containerID="92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.744744 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-cni-netd\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.744786 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-log-socket\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.744814 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj6gc\" (UniqueName: \"kubernetes.io/projected/0fa200f2-320b-4054-9d05-53d0c251120b-kube-api-access-lj6gc\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.744850 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0fa200f2-320b-4054-9d05-53d0c251120b-ovnkube-config\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.744882 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0fa200f2-320b-4054-9d05-53d0c251120b-env-overrides\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.744919 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-run-systemd\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.744984 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-run-netns\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745197 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-var-lib-openvswitch\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745259 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-run-ovn\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745298 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745333 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-etc-openvswitch\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745421 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-kubelet\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745462 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-run-openvswitch\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745519 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745556 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-node-log\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745597 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-slash\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745620 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-cni-bin\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745663 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0fa200f2-320b-4054-9d05-53d0c251120b-ovnkube-script-lib\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745689 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0fa200f2-320b-4054-9d05-53d0c251120b-ovn-node-metrics-cert\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745729 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-systemd-units\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745850 4753 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745881 4753 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-node-log\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745894 4753 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-slash\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745907 4753 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-log-socket\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745918 4753 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745930 4753 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745941 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl6hh\" 
(UniqueName: \"kubernetes.io/projected/a796c89a-761f-48d7-80b5-031f75703f32-kube-api-access-xl6hh\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745953 4753 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745964 4753 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.745976 4753 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.746077 4753 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.746107 4753 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.746130 4753 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a796c89a-761f-48d7-80b5-031f75703f32-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.746372 4753 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a796c89a-761f-48d7-80b5-031f75703f32-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.746401 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a796c89a-761f-48d7-80b5-031f75703f32-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.751370 4753 scope.go:117] "RemoveContainer" containerID="942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.766936 4753 scope.go:117] "RemoveContainer" containerID="efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.781426 4753 scope.go:117] "RemoveContainer" containerID="ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.795855 4753 scope.go:117] "RemoveContainer" containerID="98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.812081 4753 scope.go:117] "RemoveContainer" containerID="3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.829474 4753 scope.go:117] "RemoveContainer" containerID="c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.829845 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": container with ID starting with c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45 not found: ID does not exist" containerID="c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.829906 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} err="failed to get container status \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": rpc error: code = NotFound desc = could not find container \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": container with ID starting with c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.829947 4753 scope.go:117] "RemoveContainer" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.830865 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\": container with ID starting with 2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491 not found: ID does not exist" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.830920 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} err="failed to get container status \"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\": rpc error: code = NotFound desc = could not find container \"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\": container with ID starting with 2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.830950 4753 scope.go:117] "RemoveContainer" containerID="45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.831407 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\": container with ID starting with 45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355 not found: ID does not exist" containerID="45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.831475 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} err="failed to get container status \"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\": rpc error: code = NotFound desc = could not find container \"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\": container with ID starting with 45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.831522 4753 scope.go:117] "RemoveContainer" containerID="8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43" Jan 29 14:13:21 
crc kubenswrapper[4753]: E0129 14:13:21.831967 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\": container with ID starting with 8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43 not found: ID does not exist" containerID="8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.832019 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} err="failed to get container status \"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\": rpc error: code = NotFound desc = could not find container \"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\": container with ID starting with 8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.832057 4753 scope.go:117] "RemoveContainer" containerID="92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.832504 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\": container with ID starting with 92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86 not found: ID does not exist" containerID="92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.832571 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} err="failed to get container status \"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\": rpc error: code = NotFound desc = could not find container \"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\": container with ID starting with 92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.832612 4753 scope.go:117] "RemoveContainer" containerID="942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.832984 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\": container with ID starting with 942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f not found: ID does not exist" containerID="942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.833048 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} err="failed to get container status \"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\": rpc error: code = NotFound desc = could not find container \"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\": container with ID starting with 942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f not found: ID does not exist" Jan 29 14:13:21 crc 
kubenswrapper[4753]: I0129 14:13:21.833090 4753 scope.go:117] "RemoveContainer" containerID="efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.833610 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\": container with ID starting with efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9 not found: ID does not exist" containerID="efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.833653 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} err="failed to get container status \"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\": rpc error: code = NotFound desc = could not find container \"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\": container with ID starting with efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.833684 4753 scope.go:117] "RemoveContainer" containerID="ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.834192 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\": container with ID starting with ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed not found: ID does not exist" containerID="ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.834246 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} err="failed to get container status \"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\": rpc error: code = NotFound desc = could not find container \"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\": container with ID starting with ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.834279 4753 scope.go:117] "RemoveContainer" containerID="98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.834582 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\": container with ID starting with 98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e not found: ID does not exist" containerID="98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.834615 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} err="failed to get container status \"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\": rpc error: code = NotFound desc = could not find container 
\"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\": container with ID starting with 98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.834635 4753 scope.go:117] "RemoveContainer" containerID="3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f" Jan 29 14:13:21 crc kubenswrapper[4753]: E0129 14:13:21.835086 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\": container with ID starting with 3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f not found: ID does not exist" containerID="3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.835132 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f"} err="failed to get container status \"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\": rpc error: code = NotFound desc = could not find container \"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\": container with ID starting with 3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.835190 4753 scope.go:117] "RemoveContainer" containerID="c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.835529 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} err="failed to get container status \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": rpc error: code = NotFound desc = could not find container \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": container with ID starting with c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.835559 4753 scope.go:117] "RemoveContainer" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.835987 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} err="failed to get container status \"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\": rpc error: code = NotFound desc = could not find container \"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\": container with ID starting with 2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.836026 4753 scope.go:117] "RemoveContainer" containerID="45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.836368 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} err="failed to get container status \"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\": rpc error: code = NotFound desc = could not find container 
\"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\": container with ID starting with 45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.836413 4753 scope.go:117] "RemoveContainer" containerID="8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.836771 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} err="failed to get container status \"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\": rpc error: code = NotFound desc = could not find container \"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\": container with ID starting with 8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.836811 4753 scope.go:117] "RemoveContainer" containerID="92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.837134 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} err="failed to get container status \"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\": rpc error: code = NotFound desc = could not find container \"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\": container with ID starting with 92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.837205 4753 scope.go:117] "RemoveContainer" containerID="942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.837647 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} err="failed to get container status \"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\": rpc error: code = NotFound desc = could not find container \"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\": container with ID starting with 942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.837688 4753 scope.go:117] "RemoveContainer" containerID="efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.838001 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} err="failed to get container status \"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\": rpc error: code = NotFound desc = could not find container \"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\": container with ID starting with efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.838068 4753 scope.go:117] "RemoveContainer" containerID="ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.838434 4753 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} err="failed to get container status \"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\": rpc error: code = NotFound desc = could not find container \"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\": container with ID starting with ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.838473 4753 scope.go:117] "RemoveContainer" containerID="98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.838761 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} err="failed to get container status \"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\": rpc error: code = NotFound desc = could not find container \"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\": container with ID starting with 98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.838802 4753 scope.go:117] "RemoveContainer" containerID="3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.839228 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f"} err="failed to get container status \"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\": rpc error: code = NotFound desc = could not find container \"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\": container with ID starting with 3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.839282 4753 scope.go:117] "RemoveContainer" containerID="c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.839626 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} err="failed to get container status \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": rpc error: code = NotFound desc = could not find container \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": container with ID starting with c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.839669 4753 scope.go:117] "RemoveContainer" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.839956 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} err="failed to get container status \"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\": rpc error: code = NotFound desc = could not find container \"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\": container with ID starting with 
2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.839986 4753 scope.go:117] "RemoveContainer" containerID="45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.840307 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} err="failed to get container status \"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\": rpc error: code = NotFound desc = could not find container \"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\": container with ID starting with 45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.840353 4753 scope.go:117] "RemoveContainer" containerID="8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.840651 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} err="failed to get container status \"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\": rpc error: code = NotFound desc = could not find container \"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\": container with ID starting with 8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.840685 4753 scope.go:117] "RemoveContainer" containerID="92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.841123 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} err="failed to get container status \"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\": rpc error: code = NotFound desc = could not find container \"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\": container with ID starting with 92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.841248 4753 scope.go:117] "RemoveContainer" containerID="942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.841581 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} err="failed to get container status \"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\": rpc error: code = NotFound desc = could not find container \"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\": container with ID starting with 942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.841627 4753 scope.go:117] "RemoveContainer" containerID="efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.842081 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} err="failed to get container status \"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\": rpc error: code = NotFound desc = could not find container \"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\": container with ID starting with efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.842224 4753 scope.go:117] "RemoveContainer" containerID="ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.842711 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} err="failed to get container status \"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\": rpc error: code = NotFound desc = could not find container \"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\": container with ID starting with ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.842755 4753 scope.go:117] "RemoveContainer" containerID="98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.843202 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} err="failed to get container status \"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\": rpc error: code = NotFound desc = could not find container \"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\": container with ID starting with 98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.843248 4753 scope.go:117] "RemoveContainer" containerID="3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.843694 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f"} err="failed to get container status \"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\": rpc error: code = NotFound desc = could not find container \"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\": container with ID starting with 3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.843738 4753 scope.go:117] "RemoveContainer" containerID="c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.844072 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} err="failed to get container status \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": rpc error: code = NotFound desc = could not find container \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": container with ID starting with c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45 not found: ID does not exist" Jan 
29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.844102 4753 scope.go:117] "RemoveContainer" containerID="2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.844431 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491"} err="failed to get container status \"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\": rpc error: code = NotFound desc = could not find container \"2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491\": container with ID starting with 2218fb59ab445f05e21070aa231cdd5fa889e0d1baa27b1106b699344cb30491 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.844538 4753 scope.go:117] "RemoveContainer" containerID="45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.844885 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355"} err="failed to get container status \"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\": rpc error: code = NotFound desc = could not find container \"45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355\": container with ID starting with 45059e8de0a96dc22e8eec5a2cb857de7c11bdf4e3e98ca3b0ea60797cc68355 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.844910 4753 scope.go:117] "RemoveContainer" containerID="8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.845243 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43"} err="failed to get container status \"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\": rpc error: code = NotFound desc = could not find container \"8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43\": container with ID starting with 8bf528b8646b40522570aece10f5f8e6c24ad30785c16ae0463c8ed0875a8e43 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.845283 4753 scope.go:117] "RemoveContainer" containerID="92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.845640 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86"} err="failed to get container status \"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\": rpc error: code = NotFound desc = could not find container \"92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86\": container with ID starting with 92678f4bc1d2a99b321c750ece0be8094f56cd8e0de76c731073da0ffc1d6c86 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.845682 4753 scope.go:117] "RemoveContainer" containerID="942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.845998 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f"} err="failed to get container status 
\"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\": rpc error: code = NotFound desc = could not find container \"942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f\": container with ID starting with 942e73532861c2fc8382844f0770a1bef55ce78b962562d9daacb4e4ce07e42f not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.846036 4753 scope.go:117] "RemoveContainer" containerID="efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.846356 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9"} err="failed to get container status \"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\": rpc error: code = NotFound desc = could not find container \"efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9\": container with ID starting with efe41b5053d3531bfe905c8b8ffe5c436f91a9ad18c48c2751b537164db9efe9 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.846402 4753 scope.go:117] "RemoveContainer" containerID="ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.846874 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed"} err="failed to get container status \"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\": rpc error: code = NotFound desc = could not find container \"ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed\": container with ID starting with ce48c2f242fb8e90d578812c4ce49f2be1475a06c858ab1a37ddd65d17a671ed not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.846899 4753 scope.go:117] "RemoveContainer" containerID="98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847188 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-kubelet\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847239 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-run-openvswitch\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847267 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-kubelet\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847285 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxkcg\" (UID: 
\"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847340 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-node-log\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847353 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847380 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-run-openvswitch\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847385 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-slash\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847443 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-node-log\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847421 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-slash\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847397 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e"} err="failed to get container status \"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\": rpc error: code = NotFound desc = could not find container \"98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e\": container with ID starting with 98ca4698b472a94966f89eea97ca0334c656b816dd70ca1495140bb9595b3d2e not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847465 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-cni-bin\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847501 4753 scope.go:117] "RemoveContainer" containerID="3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f" Jan 29 14:13:21 crc 
kubenswrapper[4753]: I0129 14:13:21.847635 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0fa200f2-320b-4054-9d05-53d0c251120b-ovnkube-script-lib\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847500 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-cni-bin\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847688 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0fa200f2-320b-4054-9d05-53d0c251120b-ovn-node-metrics-cert\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847752 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-systemd-units\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847819 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-cni-netd\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847871 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-log-socket\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847909 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj6gc\" (UniqueName: \"kubernetes.io/projected/0fa200f2-320b-4054-9d05-53d0c251120b-kube-api-access-lj6gc\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847921 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-cni-netd\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847909 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-systemd-units\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847954 4753 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0fa200f2-320b-4054-9d05-53d0c251120b-ovnkube-config\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.847959 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-log-socket\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848078 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0fa200f2-320b-4054-9d05-53d0c251120b-env-overrides\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848231 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-run-systemd\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848301 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-run-netns\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848337 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-run-ovn\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848381 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-var-lib-openvswitch\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848425 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848497 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-etc-openvswitch\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848508 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-run-netns\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848563 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-var-lib-openvswitch\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848745 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-run-systemd\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848778 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848792 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-etc-openvswitch\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848811 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0fa200f2-320b-4054-9d05-53d0c251120b-run-ovn\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.848945 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0fa200f2-320b-4054-9d05-53d0c251120b-env-overrides\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.849190 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0fa200f2-320b-4054-9d05-53d0c251120b-ovnkube-config\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.849453 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f"} err="failed to get container status \"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\": rpc error: code = NotFound desc = could not find container \"3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f\": container with ID starting with 3f18db02a3daa23025231f9ce02fdb5ae5931f69cb52e6ce640fa1644ad2cb0f not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.849521 4753 scope.go:117] "RemoveContainer" 
containerID="c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.850127 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45"} err="failed to get container status \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": rpc error: code = NotFound desc = could not find container \"c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45\": container with ID starting with c7de37a6a804165c011394f4f817b799424e4f4d25ab13b22449499825257f45 not found: ID does not exist" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.851848 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0fa200f2-320b-4054-9d05-53d0c251120b-ovnkube-script-lib\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.853894 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0fa200f2-320b-4054-9d05-53d0c251120b-ovn-node-metrics-cert\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.866055 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj6gc\" (UniqueName: \"kubernetes.io/projected/0fa200f2-320b-4054-9d05-53d0c251120b-kube-api-access-lj6gc\") pod \"ovnkube-node-pxkcg\" (UID: \"0fa200f2-320b-4054-9d05-53d0c251120b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:21 crc kubenswrapper[4753]: I0129 14:13:21.917475 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:22 crc kubenswrapper[4753]: I0129 14:13:22.001644 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9pd9r"] Jan 29 14:13:22 crc kubenswrapper[4753]: I0129 14:13:22.008639 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9pd9r"] Jan 29 14:13:22 crc kubenswrapper[4753]: I0129 14:13:22.172033 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a796c89a-761f-48d7-80b5-031f75703f32" path="/var/lib/kubelet/pods/a796c89a-761f-48d7-80b5-031f75703f32/volumes" Jan 29 14:13:22 crc kubenswrapper[4753]: I0129 14:13:22.660960 4753 generic.go:334] "Generic (PLEG): container finished" podID="0fa200f2-320b-4054-9d05-53d0c251120b" containerID="05c2102b057ba334e9e5cf14b19157748696628c96e95ab92ca29232e89331e0" exitCode=0 Jan 29 14:13:22 crc kubenswrapper[4753]: I0129 14:13:22.661104 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerDied","Data":"05c2102b057ba334e9e5cf14b19157748696628c96e95ab92ca29232e89331e0"} Jan 29 14:13:22 crc kubenswrapper[4753]: I0129 14:13:22.661500 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerStarted","Data":"f6de97eb9f3cd36a770441c0554bf56223ee78571bb84bcad6e8c9e4f1e01542"} Jan 29 14:13:23 crc kubenswrapper[4753]: I0129 14:13:23.673448 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerStarted","Data":"9ec67c6aedc52baf729f1f093338c23e39248a9193e2531fe437cc7a2995a156"} Jan 29 14:13:23 crc kubenswrapper[4753]: I0129 14:13:23.674370 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerStarted","Data":"e5b85a8ab3125da4fded175c40be991bf2cfb2f451d50fd02c132b686d0a1209"} Jan 29 14:13:23 crc kubenswrapper[4753]: I0129 14:13:23.674385 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerStarted","Data":"fb53fe6db931d13c5d93efaa6f311d5ca349a69285a17fcbc6e379d96ce4c944"} Jan 29 14:13:23 crc kubenswrapper[4753]: I0129 14:13:23.674397 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerStarted","Data":"e4391937a0bbc8aef890fb07756663edd878057670786cafbd6f637c2a42d6c9"} Jan 29 14:13:23 crc kubenswrapper[4753]: I0129 14:13:23.674409 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerStarted","Data":"fa8e4ce5192a548ee5b0e4045958e0b0a847955ef11c89245bb4e41b60c04826"} Jan 29 14:13:23 crc kubenswrapper[4753]: I0129 14:13:23.674421 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerStarted","Data":"eb025809d3c6a2f30361ce50e2a9436d197d74682ce92dfa72f6b38f7e1aeb30"} Jan 29 14:13:26 crc kubenswrapper[4753]: I0129 14:13:26.703943 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerStarted","Data":"7696ac9ca2c01bc3cb0f7e8994b009ddcec548152cd7709ff82a6489065adb79"} Jan 29 14:13:27 crc kubenswrapper[4753]: I0129 14:13:27.874235 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9wffl"] Jan 29 14:13:27 crc kubenswrapper[4753]: I0129 14:13:27.875281 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:27 crc kubenswrapper[4753]: I0129 14:13:27.878568 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 29 14:13:27 crc kubenswrapper[4753]: I0129 14:13:27.878750 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 29 14:13:27 crc kubenswrapper[4753]: I0129 14:13:27.879023 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 29 14:13:27 crc kubenswrapper[4753]: I0129 14:13:27.879039 4753 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-x2h76" Jan 29 14:13:27 crc kubenswrapper[4753]: I0129 14:13:27.942851 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh5l4\" (UniqueName: \"kubernetes.io/projected/28beaa45-8794-4da4-9715-05084646e567-kube-api-access-rh5l4\") pod \"crc-storage-crc-9wffl\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:27 crc kubenswrapper[4753]: I0129 14:13:27.943121 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/28beaa45-8794-4da4-9715-05084646e567-crc-storage\") pod \"crc-storage-crc-9wffl\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:27 crc kubenswrapper[4753]: I0129 14:13:27.943279 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/28beaa45-8794-4da4-9715-05084646e567-node-mnt\") pod \"crc-storage-crc-9wffl\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.044541 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/28beaa45-8794-4da4-9715-05084646e567-crc-storage\") pod \"crc-storage-crc-9wffl\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.045084 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/28beaa45-8794-4da4-9715-05084646e567-node-mnt\") pod \"crc-storage-crc-9wffl\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.045205 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh5l4\" (UniqueName: \"kubernetes.io/projected/28beaa45-8794-4da4-9715-05084646e567-kube-api-access-rh5l4\") pod \"crc-storage-crc-9wffl\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " 
pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.045334 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/28beaa45-8794-4da4-9715-05084646e567-crc-storage\") pod \"crc-storage-crc-9wffl\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.045699 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/28beaa45-8794-4da4-9715-05084646e567-node-mnt\") pod \"crc-storage-crc-9wffl\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.064728 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh5l4\" (UniqueName: \"kubernetes.io/projected/28beaa45-8794-4da4-9715-05084646e567-kube-api-access-rh5l4\") pod \"crc-storage-crc-9wffl\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.201895 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: E0129 14:13:28.233561 4753 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9wffl_crc-storage_28beaa45-8794-4da4-9715-05084646e567_0(1e61557387551c371cb853b00182c1cc93fabb8a69d767dd086e08575caf50c9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 29 14:13:28 crc kubenswrapper[4753]: E0129 14:13:28.233676 4753 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9wffl_crc-storage_28beaa45-8794-4da4-9715-05084646e567_0(1e61557387551c371cb853b00182c1cc93fabb8a69d767dd086e08575caf50c9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: E0129 14:13:28.233715 4753 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9wffl_crc-storage_28beaa45-8794-4da4-9715-05084646e567_0(1e61557387551c371cb853b00182c1cc93fabb8a69d767dd086e08575caf50c9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: E0129 14:13:28.233802 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9wffl_crc-storage(28beaa45-8794-4da4-9715-05084646e567)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9wffl_crc-storage(28beaa45-8794-4da4-9715-05084646e567)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9wffl_crc-storage_28beaa45-8794-4da4-9715-05084646e567_0(1e61557387551c371cb853b00182c1cc93fabb8a69d767dd086e08575caf50c9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9wffl" podUID="28beaa45-8794-4da4-9715-05084646e567" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.736740 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" event={"ID":"0fa200f2-320b-4054-9d05-53d0c251120b","Type":"ContainerStarted","Data":"4f299a455d5b6716198526ff7cae5e614215b8120dc6bd08df3c081b23185741"} Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.737051 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.737129 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.737323 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.778720 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.781510 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.790836 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" podStartSLOduration=7.790811467 podStartE2EDuration="7.790811467s" podCreationTimestamp="2026-01-29 14:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:13:28.783443419 +0000 UTC m=+643.478177811" watchObservedRunningTime="2026-01-29 14:13:28.790811467 +0000 UTC m=+643.485545859" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.901380 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9wffl"] Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.901524 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: I0129 14:13:28.901970 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: E0129 14:13:28.936434 4753 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9wffl_crc-storage_28beaa45-8794-4da4-9715-05084646e567_0(947d01d386d0c3450484332d48a16c4670009d2164f944b85baa5d8dff370433): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 29 14:13:28 crc kubenswrapper[4753]: E0129 14:13:28.936522 4753 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9wffl_crc-storage_28beaa45-8794-4da4-9715-05084646e567_0(947d01d386d0c3450484332d48a16c4670009d2164f944b85baa5d8dff370433): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: E0129 14:13:28.936555 4753 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9wffl_crc-storage_28beaa45-8794-4da4-9715-05084646e567_0(947d01d386d0c3450484332d48a16c4670009d2164f944b85baa5d8dff370433): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:28 crc kubenswrapper[4753]: E0129 14:13:28.936618 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9wffl_crc-storage(28beaa45-8794-4da4-9715-05084646e567)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9wffl_crc-storage(28beaa45-8794-4da4-9715-05084646e567)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9wffl_crc-storage_28beaa45-8794-4da4-9715-05084646e567_0(947d01d386d0c3450484332d48a16c4670009d2164f944b85baa5d8dff370433): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9wffl" podUID="28beaa45-8794-4da4-9715-05084646e567" Jan 29 14:13:35 crc kubenswrapper[4753]: I0129 14:13:35.149224 4753 scope.go:117] "RemoveContainer" containerID="a178927f4539cbdccdd23649d9477c2c6c0b238d6a9702240d428e1e2fe90d4b" Jan 29 14:13:35 crc kubenswrapper[4753]: I0129 14:13:35.790029 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vfrvp_63926a91-5e42-4768-8277-55a0113cb5e2/kube-multus/1.log" Jan 29 14:13:35 crc kubenswrapper[4753]: I0129 14:13:35.792201 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vfrvp_63926a91-5e42-4768-8277-55a0113cb5e2/kube-multus/0.log" Jan 29 14:13:35 crc kubenswrapper[4753]: I0129 14:13:35.792282 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vfrvp" event={"ID":"63926a91-5e42-4768-8277-55a0113cb5e2","Type":"ContainerStarted","Data":"40d6201310cf7f7f823facfbe1e9c1c9bb115a77273f223d2febbcb2edf635d9"} Jan 29 14:13:41 crc kubenswrapper[4753]: I0129 14:13:41.149116 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:41 crc kubenswrapper[4753]: I0129 14:13:41.149965 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:41 crc kubenswrapper[4753]: I0129 14:13:41.430786 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9wffl"] Jan 29 14:13:41 crc kubenswrapper[4753]: I0129 14:13:41.444449 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 14:13:41 crc kubenswrapper[4753]: I0129 14:13:41.840846 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9wffl" event={"ID":"28beaa45-8794-4da4-9715-05084646e567","Type":"ContainerStarted","Data":"3f13939d8a5eaddda957f6cedeac0fa0ebf6f9ef81f6e4432558e187852ce291"} Jan 29 14:13:43 crc kubenswrapper[4753]: I0129 14:13:43.864984 4753 generic.go:334] "Generic (PLEG): container finished" podID="28beaa45-8794-4da4-9715-05084646e567" containerID="1cb2ef0e742ed0db993d4566bd709a32826d688b67e66c0349a74a83f03bd6ec" exitCode=0 Jan 29 14:13:43 crc kubenswrapper[4753]: I0129 14:13:43.865085 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9wffl" event={"ID":"28beaa45-8794-4da4-9715-05084646e567","Type":"ContainerDied","Data":"1cb2ef0e742ed0db993d4566bd709a32826d688b67e66c0349a74a83f03bd6ec"} Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.181791 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.304479 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/28beaa45-8794-4da4-9715-05084646e567-node-mnt\") pod \"28beaa45-8794-4da4-9715-05084646e567\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.304645 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh5l4\" (UniqueName: \"kubernetes.io/projected/28beaa45-8794-4da4-9715-05084646e567-kube-api-access-rh5l4\") pod \"28beaa45-8794-4da4-9715-05084646e567\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.304692 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/28beaa45-8794-4da4-9715-05084646e567-crc-storage\") pod \"28beaa45-8794-4da4-9715-05084646e567\" (UID: \"28beaa45-8794-4da4-9715-05084646e567\") " Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.304789 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28beaa45-8794-4da4-9715-05084646e567-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "28beaa45-8794-4da4-9715-05084646e567" (UID: "28beaa45-8794-4da4-9715-05084646e567"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.305378 4753 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/28beaa45-8794-4da4-9715-05084646e567-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.312469 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28beaa45-8794-4da4-9715-05084646e567-kube-api-access-rh5l4" (OuterVolumeSpecName: "kube-api-access-rh5l4") pod "28beaa45-8794-4da4-9715-05084646e567" (UID: "28beaa45-8794-4da4-9715-05084646e567"). 
InnerVolumeSpecName "kube-api-access-rh5l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.334369 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28beaa45-8794-4da4-9715-05084646e567-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "28beaa45-8794-4da4-9715-05084646e567" (UID: "28beaa45-8794-4da4-9715-05084646e567"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.406488 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh5l4\" (UniqueName: \"kubernetes.io/projected/28beaa45-8794-4da4-9715-05084646e567-kube-api-access-rh5l4\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.406535 4753 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/28beaa45-8794-4da4-9715-05084646e567-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.884537 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9wffl" event={"ID":"28beaa45-8794-4da4-9715-05084646e567","Type":"ContainerDied","Data":"3f13939d8a5eaddda957f6cedeac0fa0ebf6f9ef81f6e4432558e187852ce291"} Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.884600 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f13939d8a5eaddda957f6cedeac0fa0ebf6f9ef81f6e4432558e187852ce291" Jan 29 14:13:45 crc kubenswrapper[4753]: I0129 14:13:45.884629 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9wffl" Jan 29 14:13:46 crc kubenswrapper[4753]: I0129 14:13:46.699952 4753 scope.go:117] "RemoveContainer" containerID="450232c4d5f66c6dcf9fecf1ae3cb8be0d6a3fd94d226313437eb869d9d8d5d6" Jan 29 14:13:46 crc kubenswrapper[4753]: I0129 14:13:46.894753 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vfrvp_63926a91-5e42-4768-8277-55a0113cb5e2/kube-multus/1.log" Jan 29 14:13:51 crc kubenswrapper[4753]: I0129 14:13:51.944676 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pxkcg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.028597 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg"] Jan 29 14:13:53 crc kubenswrapper[4753]: E0129 14:13:53.029018 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28beaa45-8794-4da4-9715-05084646e567" containerName="storage" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.029046 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="28beaa45-8794-4da4-9715-05084646e567" containerName="storage" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.029386 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="28beaa45-8794-4da4-9715-05084646e567" containerName="storage" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.031045 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.033543 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.047737 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg"] Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.145633 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.145808 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mn2d\" (UniqueName: \"kubernetes.io/projected/148bd6aa-767b-4aff-9fb1-e0a34a060121-kube-api-access-2mn2d\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.145870 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.247122 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.247312 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mn2d\" (UniqueName: \"kubernetes.io/projected/148bd6aa-767b-4aff-9fb1-e0a34a060121-kube-api-access-2mn2d\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.247417 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.247922 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.248137 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.270329 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mn2d\" (UniqueName: \"kubernetes.io/projected/148bd6aa-767b-4aff-9fb1-e0a34a060121-kube-api-access-2mn2d\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.356353 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.652833 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg"] Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.951284 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" event={"ID":"148bd6aa-767b-4aff-9fb1-e0a34a060121","Type":"ContainerStarted","Data":"e660c4792f05cccf99b5456957bf4c4722ded6cc776210f5a998c4538e74f28f"} Jan 29 14:13:53 crc kubenswrapper[4753]: I0129 14:13:53.951694 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" event={"ID":"148bd6aa-767b-4aff-9fb1-e0a34a060121","Type":"ContainerStarted","Data":"a05eeac36ad62a2cbb6e0bfaef9d0fab528564a3d3ebaf22db91bd65ad67c556"} Jan 29 14:13:54 crc kubenswrapper[4753]: I0129 14:13:54.965895 4753 generic.go:334] "Generic (PLEG): container finished" podID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerID="e660c4792f05cccf99b5456957bf4c4722ded6cc776210f5a998c4538e74f28f" exitCode=0 Jan 29 14:13:54 crc kubenswrapper[4753]: I0129 14:13:54.965983 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" event={"ID":"148bd6aa-767b-4aff-9fb1-e0a34a060121","Type":"ContainerDied","Data":"e660c4792f05cccf99b5456957bf4c4722ded6cc776210f5a998c4538e74f28f"} Jan 29 14:13:56 crc kubenswrapper[4753]: I0129 14:13:56.981898 4753 generic.go:334] "Generic (PLEG): container finished" podID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerID="516435191d117169414485c1a47f790cc6e28d8facd5af2594907214305189e5" exitCode=0 Jan 29 14:13:56 crc kubenswrapper[4753]: I0129 14:13:56.982007 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" 
event={"ID":"148bd6aa-767b-4aff-9fb1-e0a34a060121","Type":"ContainerDied","Data":"516435191d117169414485c1a47f790cc6e28d8facd5af2594907214305189e5"} Jan 29 14:13:57 crc kubenswrapper[4753]: I0129 14:13:57.993182 4753 generic.go:334] "Generic (PLEG): container finished" podID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerID="5d033344268baea74bedd45d3b6b777e55a3554c97cdfb59b3f43cae4c5b60ab" exitCode=0 Jan 29 14:13:57 crc kubenswrapper[4753]: I0129 14:13:57.993248 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" event={"ID":"148bd6aa-767b-4aff-9fb1-e0a34a060121","Type":"ContainerDied","Data":"5d033344268baea74bedd45d3b6b777e55a3554c97cdfb59b3f43cae4c5b60ab"} Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.349260 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.449557 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mn2d\" (UniqueName: \"kubernetes.io/projected/148bd6aa-767b-4aff-9fb1-e0a34a060121-kube-api-access-2mn2d\") pod \"148bd6aa-767b-4aff-9fb1-e0a34a060121\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.449715 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-util\") pod \"148bd6aa-767b-4aff-9fb1-e0a34a060121\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.449839 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-bundle\") pod \"148bd6aa-767b-4aff-9fb1-e0a34a060121\" (UID: \"148bd6aa-767b-4aff-9fb1-e0a34a060121\") " Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.450986 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-bundle" (OuterVolumeSpecName: "bundle") pod "148bd6aa-767b-4aff-9fb1-e0a34a060121" (UID: "148bd6aa-767b-4aff-9fb1-e0a34a060121"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.456688 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148bd6aa-767b-4aff-9fb1-e0a34a060121-kube-api-access-2mn2d" (OuterVolumeSpecName: "kube-api-access-2mn2d") pod "148bd6aa-767b-4aff-9fb1-e0a34a060121" (UID: "148bd6aa-767b-4aff-9fb1-e0a34a060121"). InnerVolumeSpecName "kube-api-access-2mn2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.475364 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-util" (OuterVolumeSpecName: "util") pod "148bd6aa-767b-4aff-9fb1-e0a34a060121" (UID: "148bd6aa-767b-4aff-9fb1-e0a34a060121"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.551269 4753 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-util\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.551323 4753 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/148bd6aa-767b-4aff-9fb1-e0a34a060121-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:13:59 crc kubenswrapper[4753]: I0129 14:13:59.551345 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mn2d\" (UniqueName: \"kubernetes.io/projected/148bd6aa-767b-4aff-9fb1-e0a34a060121-kube-api-access-2mn2d\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:00 crc kubenswrapper[4753]: I0129 14:14:00.012682 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" event={"ID":"148bd6aa-767b-4aff-9fb1-e0a34a060121","Type":"ContainerDied","Data":"a05eeac36ad62a2cbb6e0bfaef9d0fab528564a3d3ebaf22db91bd65ad67c556"} Jan 29 14:14:00 crc kubenswrapper[4753]: I0129 14:14:00.012742 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a05eeac36ad62a2cbb6e0bfaef9d0fab528564a3d3ebaf22db91bd65ad67c556" Jan 29 14:14:00 crc kubenswrapper[4753]: I0129 14:14:00.012804 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.697796 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-6vgcx"] Jan 29 14:14:01 crc kubenswrapper[4753]: E0129 14:14:01.699334 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerName="pull" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.699505 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerName="pull" Jan 29 14:14:01 crc kubenswrapper[4753]: E0129 14:14:01.699621 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerName="extract" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.699725 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerName="extract" Jan 29 14:14:01 crc kubenswrapper[4753]: E0129 14:14:01.699831 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerName="util" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.699946 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerName="util" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.700278 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="148bd6aa-767b-4aff-9fb1-e0a34a060121" containerName="extract" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.701000 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-6vgcx" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.704564 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.704663 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-88k6c" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.705575 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.720769 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-6vgcx"] Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.891285 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvct4\" (UniqueName: \"kubernetes.io/projected/b343a24e-ab8c-4aff-8eeb-99b1f50868eb-kube-api-access-cvct4\") pod \"nmstate-operator-646758c888-6vgcx\" (UID: \"b343a24e-ab8c-4aff-8eeb-99b1f50868eb\") " pod="openshift-nmstate/nmstate-operator-646758c888-6vgcx" Jan 29 14:14:01 crc kubenswrapper[4753]: I0129 14:14:01.992938 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvct4\" (UniqueName: \"kubernetes.io/projected/b343a24e-ab8c-4aff-8eeb-99b1f50868eb-kube-api-access-cvct4\") pod \"nmstate-operator-646758c888-6vgcx\" (UID: \"b343a24e-ab8c-4aff-8eeb-99b1f50868eb\") " pod="openshift-nmstate/nmstate-operator-646758c888-6vgcx" Jan 29 14:14:02 crc kubenswrapper[4753]: I0129 14:14:02.024087 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvct4\" (UniqueName: \"kubernetes.io/projected/b343a24e-ab8c-4aff-8eeb-99b1f50868eb-kube-api-access-cvct4\") pod \"nmstate-operator-646758c888-6vgcx\" (UID: \"b343a24e-ab8c-4aff-8eeb-99b1f50868eb\") " pod="openshift-nmstate/nmstate-operator-646758c888-6vgcx" Jan 29 14:14:02 crc kubenswrapper[4753]: I0129 14:14:02.070520 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-6vgcx" Jan 29 14:14:02 crc kubenswrapper[4753]: I0129 14:14:02.295059 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-6vgcx"] Jan 29 14:14:03 crc kubenswrapper[4753]: I0129 14:14:03.034519 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-6vgcx" event={"ID":"b343a24e-ab8c-4aff-8eeb-99b1f50868eb","Type":"ContainerStarted","Data":"6e2d84ccc5867fd705a07b6f4059ca64cf12383bf795f043bb0a4fcf888de02d"} Jan 29 14:14:05 crc kubenswrapper[4753]: I0129 14:14:05.049377 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-6vgcx" event={"ID":"b343a24e-ab8c-4aff-8eeb-99b1f50868eb","Type":"ContainerStarted","Data":"8b7458cff06ac4a1e3b8b9f93f7e78bb968b15377f46c32a22b0cfd226ca4c0d"} Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.034246 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-6vgcx" podStartSLOduration=2.884914557 podStartE2EDuration="5.034218827s" podCreationTimestamp="2026-01-29 14:14:01 +0000 UTC" firstStartedPulling="2026-01-29 14:14:02.335494642 +0000 UTC m=+677.030229024" lastFinishedPulling="2026-01-29 14:14:04.484798912 +0000 UTC m=+679.179533294" observedRunningTime="2026-01-29 14:14:05.073288862 +0000 UTC m=+679.768023284" watchObservedRunningTime="2026-01-29 14:14:06.034218827 +0000 UTC m=+680.728953249" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.036591 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-nvcx7"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.038117 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-nvcx7" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.040383 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-sszlm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.058916 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-nvcx7"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.066837 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.067523 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.070818 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.072057 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-wfhtc"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.072813 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.084814 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksjhq\" (UniqueName: \"kubernetes.io/projected/34db578d-7849-457b-bf77-5bd07a7fb0b5-kube-api-access-ksjhq\") pod \"nmstate-webhook-8474b5b9d8-ksd2z\" (UID: \"34db578d-7849-457b-bf77-5bd07a7fb0b5\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.084858 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/34db578d-7849-457b-bf77-5bd07a7fb0b5-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-ksd2z\" (UID: \"34db578d-7849-457b-bf77-5bd07a7fb0b5\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.084904 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gbxp\" (UniqueName: \"kubernetes.io/projected/3c3c4dba-c774-449f-a0a9-afd2119b5730-kube-api-access-8gbxp\") pod \"nmstate-metrics-54757c584b-nvcx7\" (UID: \"3c3c4dba-c774-449f-a0a9-afd2119b5730\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-nvcx7" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.175893 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.186412 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7738b656-597e-4e3c-89ea-3b16e36b5c9f-nmstate-lock\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.186455 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gbxp\" (UniqueName: \"kubernetes.io/projected/3c3c4dba-c774-449f-a0a9-afd2119b5730-kube-api-access-8gbxp\") pod \"nmstate-metrics-54757c584b-nvcx7\" (UID: \"3c3c4dba-c774-449f-a0a9-afd2119b5730\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-nvcx7" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.186499 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpr42\" (UniqueName: \"kubernetes.io/projected/7738b656-597e-4e3c-89ea-3b16e36b5c9f-kube-api-access-fpr42\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.186623 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7738b656-597e-4e3c-89ea-3b16e36b5c9f-dbus-socket\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.186676 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksjhq\" (UniqueName: \"kubernetes.io/projected/34db578d-7849-457b-bf77-5bd07a7fb0b5-kube-api-access-ksjhq\") pod \"nmstate-webhook-8474b5b9d8-ksd2z\" (UID: \"34db578d-7849-457b-bf77-5bd07a7fb0b5\") " 
pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.186717 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7738b656-597e-4e3c-89ea-3b16e36b5c9f-ovs-socket\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.186752 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/34db578d-7849-457b-bf77-5bd07a7fb0b5-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-ksd2z\" (UID: \"34db578d-7849-457b-bf77-5bd07a7fb0b5\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:06 crc kubenswrapper[4753]: E0129 14:14:06.186903 4753 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 29 14:14:06 crc kubenswrapper[4753]: E0129 14:14:06.186945 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34db578d-7849-457b-bf77-5bd07a7fb0b5-tls-key-pair podName:34db578d-7849-457b-bf77-5bd07a7fb0b5 nodeName:}" failed. No retries permitted until 2026-01-29 14:14:06.68692895 +0000 UTC m=+681.381663332 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/34db578d-7849-457b-bf77-5bd07a7fb0b5-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-ksd2z" (UID: "34db578d-7849-457b-bf77-5bd07a7fb0b5") : secret "openshift-nmstate-webhook" not found Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.203634 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gbxp\" (UniqueName: \"kubernetes.io/projected/3c3c4dba-c774-449f-a0a9-afd2119b5730-kube-api-access-8gbxp\") pod \"nmstate-metrics-54757c584b-nvcx7\" (UID: \"3c3c4dba-c774-449f-a0a9-afd2119b5730\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-nvcx7" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.206841 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksjhq\" (UniqueName: \"kubernetes.io/projected/34db578d-7849-457b-bf77-5bd07a7fb0b5-kube-api-access-ksjhq\") pod \"nmstate-webhook-8474b5b9d8-ksd2z\" (UID: \"34db578d-7849-457b-bf77-5bd07a7fb0b5\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.220752 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.221430 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.223286 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.223601 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-t246g" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.223700 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.233005 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.288552 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7738b656-597e-4e3c-89ea-3b16e36b5c9f-nmstate-lock\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.288606 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpr42\" (UniqueName: \"kubernetes.io/projected/7738b656-597e-4e3c-89ea-3b16e36b5c9f-kube-api-access-fpr42\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.288665 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clmh9\" (UniqueName: \"kubernetes.io/projected/69a0f59f-b01e-4f27-9001-bd2460df3a27-kube-api-access-clmh9\") pod \"nmstate-console-plugin-7754f76f8b-pxnz4\" (UID: \"69a0f59f-b01e-4f27-9001-bd2460df3a27\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.289002 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7738b656-597e-4e3c-89ea-3b16e36b5c9f-dbus-socket\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.288660 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7738b656-597e-4e3c-89ea-3b16e36b5c9f-nmstate-lock\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.289122 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/69a0f59f-b01e-4f27-9001-bd2460df3a27-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-pxnz4\" (UID: \"69a0f59f-b01e-4f27-9001-bd2460df3a27\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.289445 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7738b656-597e-4e3c-89ea-3b16e36b5c9f-dbus-socket\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " 
pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.289530 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7738b656-597e-4e3c-89ea-3b16e36b5c9f-ovs-socket\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.289567 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/69a0f59f-b01e-4f27-9001-bd2460df3a27-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-pxnz4\" (UID: \"69a0f59f-b01e-4f27-9001-bd2460df3a27\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.289638 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7738b656-597e-4e3c-89ea-3b16e36b5c9f-ovs-socket\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.304517 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpr42\" (UniqueName: \"kubernetes.io/projected/7738b656-597e-4e3c-89ea-3b16e36b5c9f-kube-api-access-fpr42\") pod \"nmstate-handler-wfhtc\" (UID: \"7738b656-597e-4e3c-89ea-3b16e36b5c9f\") " pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.390468 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/69a0f59f-b01e-4f27-9001-bd2460df3a27-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-pxnz4\" (UID: \"69a0f59f-b01e-4f27-9001-bd2460df3a27\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.390523 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/69a0f59f-b01e-4f27-9001-bd2460df3a27-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-pxnz4\" (UID: \"69a0f59f-b01e-4f27-9001-bd2460df3a27\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.390618 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clmh9\" (UniqueName: \"kubernetes.io/projected/69a0f59f-b01e-4f27-9001-bd2460df3a27-kube-api-access-clmh9\") pod \"nmstate-console-plugin-7754f76f8b-pxnz4\" (UID: \"69a0f59f-b01e-4f27-9001-bd2460df3a27\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.391840 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/69a0f59f-b01e-4f27-9001-bd2460df3a27-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-pxnz4\" (UID: \"69a0f59f-b01e-4f27-9001-bd2460df3a27\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.393399 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/69a0f59f-b01e-4f27-9001-bd2460df3a27-plugin-serving-cert\") pod 
\"nmstate-console-plugin-7754f76f8b-pxnz4\" (UID: \"69a0f59f-b01e-4f27-9001-bd2460df3a27\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.398171 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b5f5df768-q7mxm"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.398974 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.401625 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-nvcx7" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.417170 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clmh9\" (UniqueName: \"kubernetes.io/projected/69a0f59f-b01e-4f27-9001-bd2460df3a27-kube-api-access-clmh9\") pod \"nmstate-console-plugin-7754f76f8b-pxnz4\" (UID: \"69a0f59f-b01e-4f27-9001-bd2460df3a27\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.420569 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b5f5df768-q7mxm"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.427849 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.491804 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-oauth-serving-cert\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.491895 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcfc2\" (UniqueName: \"kubernetes.io/projected/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-kube-api-access-pcfc2\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.491956 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-console-serving-cert\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.491971 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-service-ca\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.492019 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-console-config\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 
14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.492083 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-trusted-ca-bundle\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.492198 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-console-oauth-config\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.550629 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.593373 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-oauth-serving-cert\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.593472 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcfc2\" (UniqueName: \"kubernetes.io/projected/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-kube-api-access-pcfc2\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.593540 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-console-serving-cert\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.593557 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-service-ca\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.593601 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-console-config\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.593615 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-trusted-ca-bundle\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.593643 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-console-oauth-config\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.595474 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-oauth-serving-cert\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.595570 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-console-config\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.596025 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-service-ca\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.596427 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-trusted-ca-bundle\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.600242 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-console-serving-cert\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.601392 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-console-oauth-config\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.608335 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-nvcx7"] Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.612142 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcfc2\" (UniqueName: \"kubernetes.io/projected/c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d-kube-api-access-pcfc2\") pod \"console-5b5f5df768-q7mxm\" (UID: \"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d\") " pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:06 crc kubenswrapper[4753]: W0129 14:14:06.612655 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3c4dba_c774_449f_a0a9_afd2119b5730.slice/crio-6ccf154bee351a4ca13d599285f35552565cf8f22d24aa9eb8c3651bea939cc8 WatchSource:0}: Error finding container 6ccf154bee351a4ca13d599285f35552565cf8f22d24aa9eb8c3651bea939cc8: 
Status 404 returned error can't find the container with id 6ccf154bee351a4ca13d599285f35552565cf8f22d24aa9eb8c3651bea939cc8 Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.695619 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/34db578d-7849-457b-bf77-5bd07a7fb0b5-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-ksd2z\" (UID: \"34db578d-7849-457b-bf77-5bd07a7fb0b5\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.700031 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/34db578d-7849-457b-bf77-5bd07a7fb0b5-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-ksd2z\" (UID: \"34db578d-7849-457b-bf77-5bd07a7fb0b5\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.718627 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.722315 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4"] Jan 29 14:14:06 crc kubenswrapper[4753]: W0129 14:14:06.730938 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a0f59f_b01e_4f27_9001_bd2460df3a27.slice/crio-bc2b4b91222964866c25cadbbf8884e224d451c1d4680816dbf309c6a83d62d2 WatchSource:0}: Error finding container bc2b4b91222964866c25cadbbf8884e224d451c1d4680816dbf309c6a83d62d2: Status 404 returned error can't find the container with id bc2b4b91222964866c25cadbbf8884e224d451c1d4680816dbf309c6a83d62d2 Jan 29 14:14:06 crc kubenswrapper[4753]: I0129 14:14:06.764246 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:07 crc kubenswrapper[4753]: I0129 14:14:07.000758 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b5f5df768-q7mxm"] Jan 29 14:14:07 crc kubenswrapper[4753]: W0129 14:14:07.003989 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ac8a3c_f3d0_43e7_818b_7ad9fe76163d.slice/crio-50daeda41e52bdf97f5ab65b17d2b7ebe9ca0931bcb517e49c1925af95bafa2f WatchSource:0}: Error finding container 50daeda41e52bdf97f5ab65b17d2b7ebe9ca0931bcb517e49c1925af95bafa2f: Status 404 returned error can't find the container with id 50daeda41e52bdf97f5ab65b17d2b7ebe9ca0931bcb517e49c1925af95bafa2f Jan 29 14:14:07 crc kubenswrapper[4753]: I0129 14:14:07.064318 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" event={"ID":"69a0f59f-b01e-4f27-9001-bd2460df3a27","Type":"ContainerStarted","Data":"bc2b4b91222964866c25cadbbf8884e224d451c1d4680816dbf309c6a83d62d2"} Jan 29 14:14:07 crc kubenswrapper[4753]: I0129 14:14:07.066319 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-nvcx7" event={"ID":"3c3c4dba-c774-449f-a0a9-afd2119b5730","Type":"ContainerStarted","Data":"6ccf154bee351a4ca13d599285f35552565cf8f22d24aa9eb8c3651bea939cc8"} Jan 29 14:14:07 crc kubenswrapper[4753]: I0129 14:14:07.067868 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b5f5df768-q7mxm" event={"ID":"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d","Type":"ContainerStarted","Data":"50daeda41e52bdf97f5ab65b17d2b7ebe9ca0931bcb517e49c1925af95bafa2f"} Jan 29 14:14:07 crc kubenswrapper[4753]: I0129 14:14:07.070941 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wfhtc" event={"ID":"7738b656-597e-4e3c-89ea-3b16e36b5c9f","Type":"ContainerStarted","Data":"aae68a92b705aa86778451e578934b6e1cff59d500650ce3309a703c12308666"} Jan 29 14:14:07 crc kubenswrapper[4753]: I0129 14:14:07.182197 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z"] Jan 29 14:14:07 crc kubenswrapper[4753]: W0129 14:14:07.191339 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34db578d_7849_457b_bf77_5bd07a7fb0b5.slice/crio-b8f8e10402409da0ab0c707f15e40c17b997f6d91afed74bfa25fc4c2cea787b WatchSource:0}: Error finding container b8f8e10402409da0ab0c707f15e40c17b997f6d91afed74bfa25fc4c2cea787b: Status 404 returned error can't find the container with id b8f8e10402409da0ab0c707f15e40c17b997f6d91afed74bfa25fc4c2cea787b Jan 29 14:14:08 crc kubenswrapper[4753]: I0129 14:14:08.078234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b5f5df768-q7mxm" event={"ID":"c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d","Type":"ContainerStarted","Data":"65523676b28564e6978490d8bdfda126b274bc3255a9ae0e19648f1fea49b3c9"} Jan 29 14:14:08 crc kubenswrapper[4753]: I0129 14:14:08.081274 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" event={"ID":"34db578d-7849-457b-bf77-5bd07a7fb0b5","Type":"ContainerStarted","Data":"b8f8e10402409da0ab0c707f15e40c17b997f6d91afed74bfa25fc4c2cea787b"} Jan 29 14:14:08 crc kubenswrapper[4753]: I0129 14:14:08.094313 4753 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-console/console-5b5f5df768-q7mxm" podStartSLOduration=2.094290518 podStartE2EDuration="2.094290518s" podCreationTimestamp="2026-01-29 14:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:14:08.091480292 +0000 UTC m=+682.786214744" watchObservedRunningTime="2026-01-29 14:14:08.094290518 +0000 UTC m=+682.789024930" Jan 29 14:14:10 crc kubenswrapper[4753]: I0129 14:14:10.095795 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" event={"ID":"34db578d-7849-457b-bf77-5bd07a7fb0b5","Type":"ContainerStarted","Data":"b64284edc6051c6b1fb6709fef1bb65679ecc0382682b675a549ba5366c8bace"} Jan 29 14:14:10 crc kubenswrapper[4753]: I0129 14:14:10.096508 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:10 crc kubenswrapper[4753]: I0129 14:14:10.098356 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-nvcx7" event={"ID":"3c3c4dba-c774-449f-a0a9-afd2119b5730","Type":"ContainerStarted","Data":"dcf89435eccd44c74ac0a9f66a824f22ceb84d6ad808f544d3d93c26d792a740"} Jan 29 14:14:10 crc kubenswrapper[4753]: I0129 14:14:10.101134 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" event={"ID":"69a0f59f-b01e-4f27-9001-bd2460df3a27","Type":"ContainerStarted","Data":"8f2e9e3914413bc8153f967af7db9bed1c04402f736c087bfc6cd319aa99e882"} Jan 29 14:14:10 crc kubenswrapper[4753]: I0129 14:14:10.103522 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wfhtc" event={"ID":"7738b656-597e-4e3c-89ea-3b16e36b5c9f","Type":"ContainerStarted","Data":"d2141b09c4e2fe109fe0126fc4acf5257555a996bdb8c6839a347cc2682fc547"} Jan 29 14:14:10 crc kubenswrapper[4753]: I0129 14:14:10.103747 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:10 crc kubenswrapper[4753]: I0129 14:14:10.121606 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" podStartSLOduration=1.817420454 podStartE2EDuration="4.121579005s" podCreationTimestamp="2026-01-29 14:14:06 +0000 UTC" firstStartedPulling="2026-01-29 14:14:07.194393212 +0000 UTC m=+681.889127594" lastFinishedPulling="2026-01-29 14:14:09.498551753 +0000 UTC m=+684.193286145" observedRunningTime="2026-01-29 14:14:10.112442309 +0000 UTC m=+684.807176701" watchObservedRunningTime="2026-01-29 14:14:10.121579005 +0000 UTC m=+684.816313417" Jan 29 14:14:10 crc kubenswrapper[4753]: I0129 14:14:10.136692 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-wfhtc" podStartSLOduration=1.112704077 podStartE2EDuration="4.136656432s" podCreationTimestamp="2026-01-29 14:14:06 +0000 UTC" firstStartedPulling="2026-01-29 14:14:06.452239174 +0000 UTC m=+681.146973556" lastFinishedPulling="2026-01-29 14:14:09.476191509 +0000 UTC m=+684.170925911" observedRunningTime="2026-01-29 14:14:10.128589394 +0000 UTC m=+684.823323806" watchObservedRunningTime="2026-01-29 14:14:10.136656432 +0000 UTC m=+684.831390864" Jan 29 14:14:10 crc kubenswrapper[4753]: I0129 14:14:10.155891 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-pxnz4" podStartSLOduration=1.42580494 podStartE2EDuration="4.155870961s" podCreationTimestamp="2026-01-29 14:14:06 +0000 UTC" firstStartedPulling="2026-01-29 14:14:06.735470501 +0000 UTC m=+681.430204883" lastFinishedPulling="2026-01-29 14:14:09.465536512 +0000 UTC m=+684.160270904" observedRunningTime="2026-01-29 14:14:10.151328719 +0000 UTC m=+684.846063141" watchObservedRunningTime="2026-01-29 14:14:10.155870961 +0000 UTC m=+684.850605353" Jan 29 14:14:13 crc kubenswrapper[4753]: I0129 14:14:13.133110 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-nvcx7" event={"ID":"3c3c4dba-c774-449f-a0a9-afd2119b5730","Type":"ContainerStarted","Data":"7f514b8f53b80b82de886a22933c28e4c2ccaf973bc83cf4e8e468a34d9da8bc"} Jan 29 14:14:13 crc kubenswrapper[4753]: I0129 14:14:13.157002 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-nvcx7" podStartSLOduration=1.6664879780000001 podStartE2EDuration="7.15698061s" podCreationTimestamp="2026-01-29 14:14:06 +0000 UTC" firstStartedPulling="2026-01-29 14:14:06.614925316 +0000 UTC m=+681.309659698" lastFinishedPulling="2026-01-29 14:14:12.105417948 +0000 UTC m=+686.800152330" observedRunningTime="2026-01-29 14:14:13.15511683 +0000 UTC m=+687.849851242" watchObservedRunningTime="2026-01-29 14:14:13.15698061 +0000 UTC m=+687.851715022" Jan 29 14:14:16 crc kubenswrapper[4753]: I0129 14:14:16.473425 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-wfhtc" Jan 29 14:14:16 crc kubenswrapper[4753]: I0129 14:14:16.765457 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:16 crc kubenswrapper[4753]: I0129 14:14:16.765539 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:16 crc kubenswrapper[4753]: I0129 14:14:16.773808 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:17 crc kubenswrapper[4753]: I0129 14:14:17.173217 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b5f5df768-q7mxm" Jan 29 14:14:17 crc kubenswrapper[4753]: I0129 14:14:17.288446 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bxx4k"] Jan 29 14:14:26 crc kubenswrapper[4753]: I0129 14:14:26.723546 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ksd2z" Jan 29 14:14:33 crc kubenswrapper[4753]: I0129 14:14:33.884564 4753 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.327839 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-bxx4k" podUID="810f6f52-858d-46f6-b2d2-71f9c3135263" containerName="console" containerID="cri-o://afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d" gracePeriod=15 Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.734004 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bxx4k_810f6f52-858d-46f6-b2d2-71f9c3135263/console/0.log" Jan 29 14:14:42 crc 
kubenswrapper[4753]: I0129 14:14:42.734124 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.761858 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-console-config\") pod \"810f6f52-858d-46f6-b2d2-71f9c3135263\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.761928 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-service-ca\") pod \"810f6f52-858d-46f6-b2d2-71f9c3135263\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.761976 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-oauth-config\") pod \"810f6f52-858d-46f6-b2d2-71f9c3135263\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.762020 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-serving-cert\") pod \"810f6f52-858d-46f6-b2d2-71f9c3135263\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.762071 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-oauth-serving-cert\") pod \"810f6f52-858d-46f6-b2d2-71f9c3135263\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.762096 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-trusted-ca-bundle\") pod \"810f6f52-858d-46f6-b2d2-71f9c3135263\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.762123 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz2vq\" (UniqueName: \"kubernetes.io/projected/810f6f52-858d-46f6-b2d2-71f9c3135263-kube-api-access-dz2vq\") pod \"810f6f52-858d-46f6-b2d2-71f9c3135263\" (UID: \"810f6f52-858d-46f6-b2d2-71f9c3135263\") " Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.762964 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-console-config" (OuterVolumeSpecName: "console-config") pod "810f6f52-858d-46f6-b2d2-71f9c3135263" (UID: "810f6f52-858d-46f6-b2d2-71f9c3135263"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.763034 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "810f6f52-858d-46f6-b2d2-71f9c3135263" (UID: "810f6f52-858d-46f6-b2d2-71f9c3135263"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.763270 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "810f6f52-858d-46f6-b2d2-71f9c3135263" (UID: "810f6f52-858d-46f6-b2d2-71f9c3135263"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.763567 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-service-ca" (OuterVolumeSpecName: "service-ca") pod "810f6f52-858d-46f6-b2d2-71f9c3135263" (UID: "810f6f52-858d-46f6-b2d2-71f9c3135263"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.769223 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810f6f52-858d-46f6-b2d2-71f9c3135263-kube-api-access-dz2vq" (OuterVolumeSpecName: "kube-api-access-dz2vq") pod "810f6f52-858d-46f6-b2d2-71f9c3135263" (UID: "810f6f52-858d-46f6-b2d2-71f9c3135263"). InnerVolumeSpecName "kube-api-access-dz2vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.771323 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "810f6f52-858d-46f6-b2d2-71f9c3135263" (UID: "810f6f52-858d-46f6-b2d2-71f9c3135263"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.774958 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "810f6f52-858d-46f6-b2d2-71f9c3135263" (UID: "810f6f52-858d-46f6-b2d2-71f9c3135263"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.863941 4753 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.863980 4753 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.863994 4753 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.864007 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz2vq\" (UniqueName: \"kubernetes.io/projected/810f6f52-858d-46f6-b2d2-71f9c3135263-kube-api-access-dz2vq\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.864022 4753 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-console-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.864035 4753 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/810f6f52-858d-46f6-b2d2-71f9c3135263-service-ca\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:42 crc kubenswrapper[4753]: I0129 14:14:42.864046 4753 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/810f6f52-858d-46f6-b2d2-71f9c3135263-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.229727 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf"] Jan 29 14:14:43 crc kubenswrapper[4753]: E0129 14:14:43.230062 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810f6f52-858d-46f6-b2d2-71f9c3135263" containerName="console" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.230084 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="810f6f52-858d-46f6-b2d2-71f9c3135263" containerName="console" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.230233 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="810f6f52-858d-46f6-b2d2-71f9c3135263" containerName="console" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.231243 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.233956 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.252003 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf"] Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.348978 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bxx4k_810f6f52-858d-46f6-b2d2-71f9c3135263/console/0.log" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.349041 4753 generic.go:334] "Generic (PLEG): container finished" podID="810f6f52-858d-46f6-b2d2-71f9c3135263" containerID="afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d" exitCode=2 Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.349090 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bxx4k" event={"ID":"810f6f52-858d-46f6-b2d2-71f9c3135263","Type":"ContainerDied","Data":"afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d"} Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.349139 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bxx4k" event={"ID":"810f6f52-858d-46f6-b2d2-71f9c3135263","Type":"ContainerDied","Data":"ec77a16d63b0f3e13f632231d0532b6182f70ffc8ad80a54069e8b11c4af903c"} Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.349199 4753 scope.go:117] "RemoveContainer" containerID="afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.349355 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-bxx4k" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.370641 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.370687 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.370730 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7jmj\" (UniqueName: \"kubernetes.io/projected/11f899c0-458d-470c-862b-a13ec652365c-kube-api-access-p7jmj\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.371726 4753 scope.go:117] "RemoveContainer" containerID="afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d" Jan 29 14:14:43 crc kubenswrapper[4753]: E0129 14:14:43.372172 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d\": container with ID starting with afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d not found: ID does not exist" containerID="afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.372207 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d"} err="failed to get container status \"afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d\": rpc error: code = NotFound desc = could not find container \"afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d\": container with ID starting with afa99e323c5f0431f2d3b3a7dd62d5a78f7038e5aedfe5530b9133937ef42d6d not found: ID does not exist" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.381488 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bxx4k"] Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.386346 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-bxx4k"] Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.471802 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7jmj\" (UniqueName: \"kubernetes.io/projected/11f899c0-458d-470c-862b-a13ec652365c-kube-api-access-p7jmj\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.471929 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.471982 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.472651 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.472798 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.499753 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7jmj\" (UniqueName: \"kubernetes.io/projected/11f899c0-458d-470c-862b-a13ec652365c-kube-api-access-p7jmj\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.562787 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:43 crc kubenswrapper[4753]: I0129 14:14:43.875895 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf"] Jan 29 14:14:44 crc kubenswrapper[4753]: I0129 14:14:44.156467 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810f6f52-858d-46f6-b2d2-71f9c3135263" path="/var/lib/kubelet/pods/810f6f52-858d-46f6-b2d2-71f9c3135263/volumes" Jan 29 14:14:44 crc kubenswrapper[4753]: I0129 14:14:44.363843 4753 generic.go:334] "Generic (PLEG): container finished" podID="11f899c0-458d-470c-862b-a13ec652365c" containerID="49a8cc6df3527360b7db5abf7cc7b039adfd8ea2671754e19724caba3ab5c95e" exitCode=0 Jan 29 14:14:44 crc kubenswrapper[4753]: I0129 14:14:44.363922 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" event={"ID":"11f899c0-458d-470c-862b-a13ec652365c","Type":"ContainerDied","Data":"49a8cc6df3527360b7db5abf7cc7b039adfd8ea2671754e19724caba3ab5c95e"} Jan 29 14:14:44 crc kubenswrapper[4753]: I0129 14:14:44.364009 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" event={"ID":"11f899c0-458d-470c-862b-a13ec652365c","Type":"ContainerStarted","Data":"2434902d41212ba159e6fb5e416894f8aad6f254cbe4a761c93f57e270ca46df"} Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.580928 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-76ckz"] Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.583558 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.595463 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-76ckz"] Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.707129 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-utilities\") pod \"redhat-operators-76ckz\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.707238 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-catalog-content\") pod \"redhat-operators-76ckz\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.707378 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz8m4\" (UniqueName: \"kubernetes.io/projected/7c9c41ea-ce38-4e1d-97ec-33712358724b-kube-api-access-vz8m4\") pod \"redhat-operators-76ckz\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.808680 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz8m4\" (UniqueName: \"kubernetes.io/projected/7c9c41ea-ce38-4e1d-97ec-33712358724b-kube-api-access-vz8m4\") pod \"redhat-operators-76ckz\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.808786 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-utilities\") pod \"redhat-operators-76ckz\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.808817 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-catalog-content\") pod \"redhat-operators-76ckz\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.809434 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-utilities\") pod \"redhat-operators-76ckz\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.809587 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-catalog-content\") pod \"redhat-operators-76ckz\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.835501 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vz8m4\" (UniqueName: \"kubernetes.io/projected/7c9c41ea-ce38-4e1d-97ec-33712358724b-kube-api-access-vz8m4\") pod \"redhat-operators-76ckz\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:45 crc kubenswrapper[4753]: I0129 14:14:45.955144 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:46 crc kubenswrapper[4753]: I0129 14:14:46.191020 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-76ckz"] Jan 29 14:14:46 crc kubenswrapper[4753]: W0129 14:14:46.202694 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9c41ea_ce38_4e1d_97ec_33712358724b.slice/crio-a729682583185c3f4dfa258776fef95c9fee01291964758ebd7b4cb39d6cd2e9 WatchSource:0}: Error finding container a729682583185c3f4dfa258776fef95c9fee01291964758ebd7b4cb39d6cd2e9: Status 404 returned error can't find the container with id a729682583185c3f4dfa258776fef95c9fee01291964758ebd7b4cb39d6cd2e9 Jan 29 14:14:46 crc kubenswrapper[4753]: I0129 14:14:46.378235 4753 generic.go:334] "Generic (PLEG): container finished" podID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerID="f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03" exitCode=0 Jan 29 14:14:46 crc kubenswrapper[4753]: I0129 14:14:46.378619 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ckz" event={"ID":"7c9c41ea-ce38-4e1d-97ec-33712358724b","Type":"ContainerDied","Data":"f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03"} Jan 29 14:14:46 crc kubenswrapper[4753]: I0129 14:14:46.378648 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ckz" event={"ID":"7c9c41ea-ce38-4e1d-97ec-33712358724b","Type":"ContainerStarted","Data":"a729682583185c3f4dfa258776fef95c9fee01291964758ebd7b4cb39d6cd2e9"} Jan 29 14:14:46 crc kubenswrapper[4753]: I0129 14:14:46.382934 4753 generic.go:334] "Generic (PLEG): container finished" podID="11f899c0-458d-470c-862b-a13ec652365c" containerID="51ddb87bb183d0420dd2992968d97f76bc0a583eeba97548073889f252fad2c4" exitCode=0 Jan 29 14:14:46 crc kubenswrapper[4753]: I0129 14:14:46.382964 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" event={"ID":"11f899c0-458d-470c-862b-a13ec652365c","Type":"ContainerDied","Data":"51ddb87bb183d0420dd2992968d97f76bc0a583eeba97548073889f252fad2c4"} Jan 29 14:14:46 crc kubenswrapper[4753]: E0129 14:14:46.410334 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9c41ea_ce38_4e1d_97ec_33712358724b.slice/crio-f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9c41ea_ce38_4e1d_97ec_33712358724b.slice/crio-conmon-f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03.scope\": RecentStats: unable to find data in memory cache]" Jan 29 14:14:47 crc kubenswrapper[4753]: I0129 14:14:47.392933 4753 generic.go:334] "Generic (PLEG): container finished" podID="11f899c0-458d-470c-862b-a13ec652365c" 
containerID="8ffa47beb1f280d72e07ffe79e18a0830cbd1eb3d93acb73620b843bf6b1c0f4" exitCode=0 Jan 29 14:14:47 crc kubenswrapper[4753]: I0129 14:14:47.393001 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" event={"ID":"11f899c0-458d-470c-862b-a13ec652365c","Type":"ContainerDied","Data":"8ffa47beb1f280d72e07ffe79e18a0830cbd1eb3d93acb73620b843bf6b1c0f4"} Jan 29 14:14:47 crc kubenswrapper[4753]: I0129 14:14:47.396706 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ckz" event={"ID":"7c9c41ea-ce38-4e1d-97ec-33712358724b","Type":"ContainerStarted","Data":"11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef"} Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.405626 4753 generic.go:334] "Generic (PLEG): container finished" podID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerID="11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef" exitCode=0 Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.405717 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ckz" event={"ID":"7c9c41ea-ce38-4e1d-97ec-33712358724b","Type":"ContainerDied","Data":"11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef"} Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.747276 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.767113 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-bundle\") pod \"11f899c0-458d-470c-862b-a13ec652365c\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.767199 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7jmj\" (UniqueName: \"kubernetes.io/projected/11f899c0-458d-470c-862b-a13ec652365c-kube-api-access-p7jmj\") pod \"11f899c0-458d-470c-862b-a13ec652365c\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.767258 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-util\") pod \"11f899c0-458d-470c-862b-a13ec652365c\" (UID: \"11f899c0-458d-470c-862b-a13ec652365c\") " Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.770257 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-bundle" (OuterVolumeSpecName: "bundle") pod "11f899c0-458d-470c-862b-a13ec652365c" (UID: "11f899c0-458d-470c-862b-a13ec652365c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.811282 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-util" (OuterVolumeSpecName: "util") pod "11f899c0-458d-470c-862b-a13ec652365c" (UID: "11f899c0-458d-470c-862b-a13ec652365c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.825070 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f899c0-458d-470c-862b-a13ec652365c-kube-api-access-p7jmj" (OuterVolumeSpecName: "kube-api-access-p7jmj") pod "11f899c0-458d-470c-862b-a13ec652365c" (UID: "11f899c0-458d-470c-862b-a13ec652365c"). InnerVolumeSpecName "kube-api-access-p7jmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.873109 4753 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.873176 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7jmj\" (UniqueName: \"kubernetes.io/projected/11f899c0-458d-470c-862b-a13ec652365c-kube-api-access-p7jmj\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:48 crc kubenswrapper[4753]: I0129 14:14:48.873197 4753 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11f899c0-458d-470c-862b-a13ec652365c-util\") on node \"crc\" DevicePath \"\"" Jan 29 14:14:49 crc kubenswrapper[4753]: I0129 14:14:49.411824 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" event={"ID":"11f899c0-458d-470c-862b-a13ec652365c","Type":"ContainerDied","Data":"2434902d41212ba159e6fb5e416894f8aad6f254cbe4a761c93f57e270ca46df"} Jan 29 14:14:49 crc kubenswrapper[4753]: I0129 14:14:49.411862 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2434902d41212ba159e6fb5e416894f8aad6f254cbe4a761c93f57e270ca46df" Jan 29 14:14:49 crc kubenswrapper[4753]: I0129 14:14:49.411965 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf" Jan 29 14:14:49 crc kubenswrapper[4753]: I0129 14:14:49.416852 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ckz" event={"ID":"7c9c41ea-ce38-4e1d-97ec-33712358724b","Type":"ContainerStarted","Data":"89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e"} Jan 29 14:14:49 crc kubenswrapper[4753]: I0129 14:14:49.451217 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-76ckz" podStartSLOduration=2.005527185 podStartE2EDuration="4.451194627s" podCreationTimestamp="2026-01-29 14:14:45 +0000 UTC" firstStartedPulling="2026-01-29 14:14:46.380310834 +0000 UTC m=+721.075045226" lastFinishedPulling="2026-01-29 14:14:48.825978286 +0000 UTC m=+723.520712668" observedRunningTime="2026-01-29 14:14:49.448596636 +0000 UTC m=+724.143331058" watchObservedRunningTime="2026-01-29 14:14:49.451194627 +0000 UTC m=+724.145929019" Jan 29 14:14:55 crc kubenswrapper[4753]: I0129 14:14:55.955857 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:55 crc kubenswrapper[4753]: I0129 14:14:55.956454 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:14:56 crc kubenswrapper[4753]: I0129 14:14:56.996559 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-76ckz" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerName="registry-server" probeResult="failure" output=< Jan 29 14:14:56 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 14:14:56 crc kubenswrapper[4753]: > Jan 29 14:14:57 crc kubenswrapper[4753]: I0129 14:14:57.055074 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:14:57 crc kubenswrapper[4753]: I0129 14:14:57.055204 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.972886 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp"] Jan 29 14:14:59 crc kubenswrapper[4753]: E0129 14:14:59.973333 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f899c0-458d-470c-862b-a13ec652365c" containerName="util" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.973345 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f899c0-458d-470c-862b-a13ec652365c" containerName="util" Jan 29 14:14:59 crc kubenswrapper[4753]: E0129 14:14:59.973359 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f899c0-458d-470c-862b-a13ec652365c" containerName="extract" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.973364 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f899c0-458d-470c-862b-a13ec652365c" containerName="extract" Jan 29 
14:14:59 crc kubenswrapper[4753]: E0129 14:14:59.973373 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f899c0-458d-470c-862b-a13ec652365c" containerName="pull" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.973379 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f899c0-458d-470c-862b-a13ec652365c" containerName="pull" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.973465 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f899c0-458d-470c-862b-a13ec652365c" containerName="extract" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.973823 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.976618 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.977082 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-djf65" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.977350 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.977515 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 29 14:14:59 crc kubenswrapper[4753]: I0129 14:14:59.978392 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.033805 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp"] Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.136669 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd141147-06aa-42ff-86af-96ad3b852349-apiservice-cert\") pod \"metallb-operator-controller-manager-5657cf448d-jpgpp\" (UID: \"dd141147-06aa-42ff-86af-96ad3b852349\") " pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.136726 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95wx4\" (UniqueName: \"kubernetes.io/projected/dd141147-06aa-42ff-86af-96ad3b852349-kube-api-access-95wx4\") pod \"metallb-operator-controller-manager-5657cf448d-jpgpp\" (UID: \"dd141147-06aa-42ff-86af-96ad3b852349\") " pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.136837 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd141147-06aa-42ff-86af-96ad3b852349-webhook-cert\") pod \"metallb-operator-controller-manager-5657cf448d-jpgpp\" (UID: \"dd141147-06aa-42ff-86af-96ad3b852349\") " pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.142811 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8"] Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.143697 4753 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.145649 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.146073 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.158873 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8"] Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.237840 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95wx4\" (UniqueName: \"kubernetes.io/projected/dd141147-06aa-42ff-86af-96ad3b852349-kube-api-access-95wx4\") pod \"metallb-operator-controller-manager-5657cf448d-jpgpp\" (UID: \"dd141147-06aa-42ff-86af-96ad3b852349\") " pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.237917 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f981cab8-0955-4cb1-98ed-a7aecbca702c-secret-volume\") pod \"collect-profiles-29494935-n4hw8\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.237987 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd141147-06aa-42ff-86af-96ad3b852349-webhook-cert\") pod \"metallb-operator-controller-manager-5657cf448d-jpgpp\" (UID: \"dd141147-06aa-42ff-86af-96ad3b852349\") " pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.238035 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv8qj\" (UniqueName: \"kubernetes.io/projected/f981cab8-0955-4cb1-98ed-a7aecbca702c-kube-api-access-dv8qj\") pod \"collect-profiles-29494935-n4hw8\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.238081 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd141147-06aa-42ff-86af-96ad3b852349-apiservice-cert\") pod \"metallb-operator-controller-manager-5657cf448d-jpgpp\" (UID: \"dd141147-06aa-42ff-86af-96ad3b852349\") " pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.238109 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f981cab8-0955-4cb1-98ed-a7aecbca702c-config-volume\") pod \"collect-profiles-29494935-n4hw8\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.248811 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/dd141147-06aa-42ff-86af-96ad3b852349-apiservice-cert\") pod \"metallb-operator-controller-manager-5657cf448d-jpgpp\" (UID: \"dd141147-06aa-42ff-86af-96ad3b852349\") " pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.248852 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd141147-06aa-42ff-86af-96ad3b852349-webhook-cert\") pod \"metallb-operator-controller-manager-5657cf448d-jpgpp\" (UID: \"dd141147-06aa-42ff-86af-96ad3b852349\") " pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.260670 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95wx4\" (UniqueName: \"kubernetes.io/projected/dd141147-06aa-42ff-86af-96ad3b852349-kube-api-access-95wx4\") pod \"metallb-operator-controller-manager-5657cf448d-jpgpp\" (UID: \"dd141147-06aa-42ff-86af-96ad3b852349\") " pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.289411 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.316747 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk"] Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.317411 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.319416 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-v4xwn" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.319661 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.320756 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.338498 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk"] Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.339993 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv8qj\" (UniqueName: \"kubernetes.io/projected/f981cab8-0955-4cb1-98ed-a7aecbca702c-kube-api-access-dv8qj\") pod \"collect-profiles-29494935-n4hw8\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.340093 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f981cab8-0955-4cb1-98ed-a7aecbca702c-config-volume\") pod \"collect-profiles-29494935-n4hw8\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.340167 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f981cab8-0955-4cb1-98ed-a7aecbca702c-secret-volume\") pod \"collect-profiles-29494935-n4hw8\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.340975 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f981cab8-0955-4cb1-98ed-a7aecbca702c-config-volume\") pod \"collect-profiles-29494935-n4hw8\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.360200 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f981cab8-0955-4cb1-98ed-a7aecbca702c-secret-volume\") pod \"collect-profiles-29494935-n4hw8\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.374758 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv8qj\" (UniqueName: \"kubernetes.io/projected/f981cab8-0955-4cb1-98ed-a7aecbca702c-kube-api-access-dv8qj\") pod \"collect-profiles-29494935-n4hw8\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.441111 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82c917f2-b33e-4508-8f83-b54e5238fb38-apiservice-cert\") pod \"metallb-operator-webhook-server-7b447d96c7-khxwk\" (UID: \"82c917f2-b33e-4508-8f83-b54e5238fb38\") " pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.441622 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82c917f2-b33e-4508-8f83-b54e5238fb38-webhook-cert\") pod \"metallb-operator-webhook-server-7b447d96c7-khxwk\" (UID: \"82c917f2-b33e-4508-8f83-b54e5238fb38\") " pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.441652 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzqxb\" (UniqueName: \"kubernetes.io/projected/82c917f2-b33e-4508-8f83-b54e5238fb38-kube-api-access-lzqxb\") pod \"metallb-operator-webhook-server-7b447d96c7-khxwk\" (UID: \"82c917f2-b33e-4508-8f83-b54e5238fb38\") " pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.460541 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.556575 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82c917f2-b33e-4508-8f83-b54e5238fb38-apiservice-cert\") pod \"metallb-operator-webhook-server-7b447d96c7-khxwk\" (UID: \"82c917f2-b33e-4508-8f83-b54e5238fb38\") " pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.556618 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82c917f2-b33e-4508-8f83-b54e5238fb38-webhook-cert\") pod \"metallb-operator-webhook-server-7b447d96c7-khxwk\" (UID: \"82c917f2-b33e-4508-8f83-b54e5238fb38\") " pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.556644 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzqxb\" (UniqueName: \"kubernetes.io/projected/82c917f2-b33e-4508-8f83-b54e5238fb38-kube-api-access-lzqxb\") pod \"metallb-operator-webhook-server-7b447d96c7-khxwk\" (UID: \"82c917f2-b33e-4508-8f83-b54e5238fb38\") " pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.568925 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82c917f2-b33e-4508-8f83-b54e5238fb38-apiservice-cert\") pod \"metallb-operator-webhook-server-7b447d96c7-khxwk\" (UID: \"82c917f2-b33e-4508-8f83-b54e5238fb38\") " pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.571796 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82c917f2-b33e-4508-8f83-b54e5238fb38-webhook-cert\") pod \"metallb-operator-webhook-server-7b447d96c7-khxwk\" (UID: \"82c917f2-b33e-4508-8f83-b54e5238fb38\") " pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.572202 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzqxb\" (UniqueName: \"kubernetes.io/projected/82c917f2-b33e-4508-8f83-b54e5238fb38-kube-api-access-lzqxb\") pod \"metallb-operator-webhook-server-7b447d96c7-khxwk\" (UID: \"82c917f2-b33e-4508-8f83-b54e5238fb38\") " pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.605464 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp"] Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.711045 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.768369 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8"] Jan 29 14:15:00 crc kubenswrapper[4753]: W0129 14:15:00.969127 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82c917f2_b33e_4508_8f83_b54e5238fb38.slice/crio-3d644c1f98a72f8b9c91888f47a7d8fb2e5a729b1b5325730c2ae1bb1023535d WatchSource:0}: Error finding container 3d644c1f98a72f8b9c91888f47a7d8fb2e5a729b1b5325730c2ae1bb1023535d: Status 404 returned error can't find the container with id 3d644c1f98a72f8b9c91888f47a7d8fb2e5a729b1b5325730c2ae1bb1023535d Jan 29 14:15:00 crc kubenswrapper[4753]: I0129 14:15:00.969416 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk"] Jan 29 14:15:01 crc kubenswrapper[4753]: I0129 14:15:01.529393 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" event={"ID":"82c917f2-b33e-4508-8f83-b54e5238fb38","Type":"ContainerStarted","Data":"3d644c1f98a72f8b9c91888f47a7d8fb2e5a729b1b5325730c2ae1bb1023535d"} Jan 29 14:15:01 crc kubenswrapper[4753]: I0129 14:15:01.531845 4753 generic.go:334] "Generic (PLEG): container finished" podID="f981cab8-0955-4cb1-98ed-a7aecbca702c" containerID="f73006e1eedd78c74143a97093d2d84e767137c79d54aec2c86268166a6321db" exitCode=0 Jan 29 14:15:01 crc kubenswrapper[4753]: I0129 14:15:01.531960 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" event={"ID":"f981cab8-0955-4cb1-98ed-a7aecbca702c","Type":"ContainerDied","Data":"f73006e1eedd78c74143a97093d2d84e767137c79d54aec2c86268166a6321db"} Jan 29 14:15:01 crc kubenswrapper[4753]: I0129 14:15:01.532015 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" event={"ID":"f981cab8-0955-4cb1-98ed-a7aecbca702c","Type":"ContainerStarted","Data":"4d06affe73aa0c9512a63f0c7609e759ce9f636bea614e89092d77b0c7e806cc"} Jan 29 14:15:01 crc kubenswrapper[4753]: I0129 14:15:01.533535 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" event={"ID":"dd141147-06aa-42ff-86af-96ad3b852349","Type":"ContainerStarted","Data":"95cb0fa173d80938cae11880eaee5793ec8e27a092543476672cc0116735fa66"} Jan 29 14:15:02 crc kubenswrapper[4753]: I0129 14:15:02.843617 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:02 crc kubenswrapper[4753]: I0129 14:15:02.989341 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv8qj\" (UniqueName: \"kubernetes.io/projected/f981cab8-0955-4cb1-98ed-a7aecbca702c-kube-api-access-dv8qj\") pod \"f981cab8-0955-4cb1-98ed-a7aecbca702c\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " Jan 29 14:15:02 crc kubenswrapper[4753]: I0129 14:15:02.989418 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f981cab8-0955-4cb1-98ed-a7aecbca702c-secret-volume\") pod \"f981cab8-0955-4cb1-98ed-a7aecbca702c\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " Jan 29 14:15:02 crc kubenswrapper[4753]: I0129 14:15:02.989452 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f981cab8-0955-4cb1-98ed-a7aecbca702c-config-volume\") pod \"f981cab8-0955-4cb1-98ed-a7aecbca702c\" (UID: \"f981cab8-0955-4cb1-98ed-a7aecbca702c\") " Jan 29 14:15:02 crc kubenswrapper[4753]: I0129 14:15:02.991114 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f981cab8-0955-4cb1-98ed-a7aecbca702c-config-volume" (OuterVolumeSpecName: "config-volume") pod "f981cab8-0955-4cb1-98ed-a7aecbca702c" (UID: "f981cab8-0955-4cb1-98ed-a7aecbca702c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:15:02 crc kubenswrapper[4753]: I0129 14:15:02.995230 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f981cab8-0955-4cb1-98ed-a7aecbca702c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f981cab8-0955-4cb1-98ed-a7aecbca702c" (UID: "f981cab8-0955-4cb1-98ed-a7aecbca702c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:15:02 crc kubenswrapper[4753]: I0129 14:15:02.996664 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f981cab8-0955-4cb1-98ed-a7aecbca702c-kube-api-access-dv8qj" (OuterVolumeSpecName: "kube-api-access-dv8qj") pod "f981cab8-0955-4cb1-98ed-a7aecbca702c" (UID: "f981cab8-0955-4cb1-98ed-a7aecbca702c"). InnerVolumeSpecName "kube-api-access-dv8qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:15:03 crc kubenswrapper[4753]: I0129 14:15:03.091106 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv8qj\" (UniqueName: \"kubernetes.io/projected/f981cab8-0955-4cb1-98ed-a7aecbca702c-kube-api-access-dv8qj\") on node \"crc\" DevicePath \"\"" Jan 29 14:15:03 crc kubenswrapper[4753]: I0129 14:15:03.091144 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f981cab8-0955-4cb1-98ed-a7aecbca702c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 14:15:03 crc kubenswrapper[4753]: I0129 14:15:03.091196 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f981cab8-0955-4cb1-98ed-a7aecbca702c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 14:15:03 crc kubenswrapper[4753]: I0129 14:15:03.547479 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" event={"ID":"f981cab8-0955-4cb1-98ed-a7aecbca702c","Type":"ContainerDied","Data":"4d06affe73aa0c9512a63f0c7609e759ce9f636bea614e89092d77b0c7e806cc"} Jan 29 14:15:03 crc kubenswrapper[4753]: I0129 14:15:03.547784 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d06affe73aa0c9512a63f0c7609e759ce9f636bea614e89092d77b0c7e806cc" Jan 29 14:15:03 crc kubenswrapper[4753]: I0129 14:15:03.547547 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8" Jan 29 14:15:06 crc kubenswrapper[4753]: I0129 14:15:06.007046 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:15:06 crc kubenswrapper[4753]: I0129 14:15:06.076978 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:15:06 crc kubenswrapper[4753]: I0129 14:15:06.574147 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" event={"ID":"82c917f2-b33e-4508-8f83-b54e5238fb38","Type":"ContainerStarted","Data":"b6e5cf11e6e06851ea981e906e3f22cd1b168eb2c3ef0547310279951d20b34f"} Jan 29 14:15:06 crc kubenswrapper[4753]: I0129 14:15:06.574392 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:06 crc kubenswrapper[4753]: I0129 14:15:06.576801 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" event={"ID":"dd141147-06aa-42ff-86af-96ad3b852349","Type":"ContainerStarted","Data":"7853802b19a33f17a114f3f3eb25368ab82455ea56740032ef7d7c92ea7c7836"} Jan 29 14:15:06 crc kubenswrapper[4753]: I0129 14:15:06.577115 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:06 crc kubenswrapper[4753]: I0129 14:15:06.603438 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" podStartSLOduration=1.895187322 podStartE2EDuration="6.603421433s" podCreationTimestamp="2026-01-29 14:15:00 +0000 UTC" firstStartedPulling="2026-01-29 14:15:00.980714222 +0000 UTC m=+735.675448604" lastFinishedPulling="2026-01-29 
14:15:05.688948323 +0000 UTC m=+740.383682715" observedRunningTime="2026-01-29 14:15:06.599630102 +0000 UTC m=+741.294364494" watchObservedRunningTime="2026-01-29 14:15:06.603421433 +0000 UTC m=+741.298155835" Jan 29 14:15:06 crc kubenswrapper[4753]: I0129 14:15:06.629412 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" podStartSLOduration=2.613732454 podStartE2EDuration="7.629388875s" podCreationTimestamp="2026-01-29 14:14:59 +0000 UTC" firstStartedPulling="2026-01-29 14:15:00.623056535 +0000 UTC m=+735.317790917" lastFinishedPulling="2026-01-29 14:15:05.638712966 +0000 UTC m=+740.333447338" observedRunningTime="2026-01-29 14:15:06.624656587 +0000 UTC m=+741.319390979" watchObservedRunningTime="2026-01-29 14:15:06.629388875 +0000 UTC m=+741.324123257" Jan 29 14:15:07 crc kubenswrapper[4753]: I0129 14:15:07.968371 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-76ckz"] Jan 29 14:15:07 crc kubenswrapper[4753]: I0129 14:15:07.969100 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-76ckz" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerName="registry-server" containerID="cri-o://89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e" gracePeriod=2 Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.399530 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.512024 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-utilities\") pod \"7c9c41ea-ce38-4e1d-97ec-33712358724b\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.512167 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz8m4\" (UniqueName: \"kubernetes.io/projected/7c9c41ea-ce38-4e1d-97ec-33712358724b-kube-api-access-vz8m4\") pod \"7c9c41ea-ce38-4e1d-97ec-33712358724b\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.512218 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-catalog-content\") pod \"7c9c41ea-ce38-4e1d-97ec-33712358724b\" (UID: \"7c9c41ea-ce38-4e1d-97ec-33712358724b\") " Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.513777 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-utilities" (OuterVolumeSpecName: "utilities") pod "7c9c41ea-ce38-4e1d-97ec-33712358724b" (UID: "7c9c41ea-ce38-4e1d-97ec-33712358724b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.518993 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c9c41ea-ce38-4e1d-97ec-33712358724b-kube-api-access-vz8m4" (OuterVolumeSpecName: "kube-api-access-vz8m4") pod "7c9c41ea-ce38-4e1d-97ec-33712358724b" (UID: "7c9c41ea-ce38-4e1d-97ec-33712358724b"). InnerVolumeSpecName "kube-api-access-vz8m4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.592540 4753 generic.go:334] "Generic (PLEG): container finished" podID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerID="89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e" exitCode=0 Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.592641 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-76ckz" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.592654 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ckz" event={"ID":"7c9c41ea-ce38-4e1d-97ec-33712358724b","Type":"ContainerDied","Data":"89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e"} Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.592736 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-76ckz" event={"ID":"7c9c41ea-ce38-4e1d-97ec-33712358724b","Type":"ContainerDied","Data":"a729682583185c3f4dfa258776fef95c9fee01291964758ebd7b4cb39d6cd2e9"} Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.592774 4753 scope.go:117] "RemoveContainer" containerID="89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.614017 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.614066 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz8m4\" (UniqueName: \"kubernetes.io/projected/7c9c41ea-ce38-4e1d-97ec-33712358724b-kube-api-access-vz8m4\") on node \"crc\" DevicePath \"\"" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.616366 4753 scope.go:117] "RemoveContainer" containerID="11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.638543 4753 scope.go:117] "RemoveContainer" containerID="f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.653686 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c9c41ea-ce38-4e1d-97ec-33712358724b" (UID: "7c9c41ea-ce38-4e1d-97ec-33712358724b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.681822 4753 scope.go:117] "RemoveContainer" containerID="89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e" Jan 29 14:15:08 crc kubenswrapper[4753]: E0129 14:15:08.682410 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e\": container with ID starting with 89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e not found: ID does not exist" containerID="89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.682483 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e"} err="failed to get container status \"89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e\": rpc error: code = NotFound desc = could not find container \"89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e\": container with ID starting with 89c00bce4b449dd662853763461646430049b205481a9096fabf3ccc5746181e not found: ID does not exist" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.682524 4753 scope.go:117] "RemoveContainer" containerID="11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef" Jan 29 14:15:08 crc kubenswrapper[4753]: E0129 14:15:08.682989 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef\": container with ID starting with 11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef not found: ID does not exist" containerID="11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.683029 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef"} err="failed to get container status \"11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef\": rpc error: code = NotFound desc = could not find container \"11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef\": container with ID starting with 11ea4611a030d75f1ac6fe0166c3c2318c1f330758022325929157bf2a1465ef not found: ID does not exist" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.683052 4753 scope.go:117] "RemoveContainer" containerID="f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03" Jan 29 14:15:08 crc kubenswrapper[4753]: E0129 14:15:08.683508 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03\": container with ID starting with f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03 not found: ID does not exist" containerID="f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.683701 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03"} err="failed to get container status \"f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03\": rpc error: code = NotFound desc = could not 
find container \"f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03\": container with ID starting with f40b7d1fd975949a811b6a9fde7f6c6ae7afc820b3369fb376a5fd6be6c31b03 not found: ID does not exist" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.715728 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c9c41ea-ce38-4e1d-97ec-33712358724b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.921454 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-76ckz"] Jan 29 14:15:08 crc kubenswrapper[4753]: I0129 14:15:08.931390 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-76ckz"] Jan 29 14:15:10 crc kubenswrapper[4753]: I0129 14:15:10.159755 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" path="/var/lib/kubelet/pods/7c9c41ea-ce38-4e1d-97ec-33712358724b/volumes" Jan 29 14:15:20 crc kubenswrapper[4753]: I0129 14:15:20.719450 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7b447d96c7-khxwk" Jan 29 14:15:27 crc kubenswrapper[4753]: I0129 14:15:27.054784 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:15:27 crc kubenswrapper[4753]: I0129 14:15:27.055702 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:15:40 crc kubenswrapper[4753]: I0129 14:15:40.294117 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5657cf448d-jpgpp" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.106134 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-6xwns"] Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.106417 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerName="extract-content" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.106433 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerName="extract-content" Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.106445 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerName="extract-utilities" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.106453 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerName="extract-utilities" Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.106474 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerName="registry-server" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.106483 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" 
containerName="registry-server" Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.106494 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f981cab8-0955-4cb1-98ed-a7aecbca702c" containerName="collect-profiles" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.106502 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f981cab8-0955-4cb1-98ed-a7aecbca702c" containerName="collect-profiles" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.106626 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c9c41ea-ce38-4e1d-97ec-33712358724b" containerName="registry-server" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.106638 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f981cab8-0955-4cb1-98ed-a7aecbca702c" containerName="collect-profiles" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.108898 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.112971 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.113047 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.113122 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-mchhr" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.124272 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj"] Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.125022 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.127799 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.139931 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj"] Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.165875 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-metrics\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.165913 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-cc5mj\" (UID: \"d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.165936 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmhpw\" (UniqueName: \"kubernetes.io/projected/110bd7ac-0311-4b68-82b0-8d33b63a24bc-kube-api-access-wmhpw\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.165964 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-frr-conf\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.165979 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/110bd7ac-0311-4b68-82b0-8d33b63a24bc-frr-startup\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.166102 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/110bd7ac-0311-4b68-82b0-8d33b63a24bc-metrics-certs\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.166146 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mkz9\" (UniqueName: \"kubernetes.io/projected/d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4-kube-api-access-4mkz9\") pod \"frr-k8s-webhook-server-7df86c4f6c-cc5mj\" (UID: \"d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.166234 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-reloader\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " 
pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.166295 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-frr-sockets\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.213398 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tvgdk"] Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.217405 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.228000 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.228186 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-7brl8" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.228343 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.228506 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.244677 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-kwv4j"] Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.247476 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.251430 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-kwv4j"] Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.262504 4753 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267684 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-metrics-certs\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267737 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-metallb-excludel2\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267766 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-metrics\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267806 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-cc5mj\" (UID: 
\"d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267843 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmhpw\" (UniqueName: \"kubernetes.io/projected/110bd7ac-0311-4b68-82b0-8d33b63a24bc-kube-api-access-wmhpw\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267875 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/110bd7ac-0311-4b68-82b0-8d33b63a24bc-frr-startup\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267892 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-frr-conf\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267915 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/110bd7ac-0311-4b68-82b0-8d33b63a24bc-metrics-certs\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267934 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mkz9\" (UniqueName: \"kubernetes.io/projected/d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4-kube-api-access-4mkz9\") pod \"frr-k8s-webhook-server-7df86c4f6c-cc5mj\" (UID: \"d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267951 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-reloader\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267978 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-memberlist\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.267995 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-frr-sockets\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.268018 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwqk8\" (UniqueName: \"kubernetes.io/projected/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-kube-api-access-cwqk8\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.268808 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-metrics\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.268876 4753 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.268914 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4-cert podName:d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4 nodeName:}" failed. No retries permitted until 2026-01-29 14:15:41.768901205 +0000 UTC m=+776.463635587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4-cert") pod "frr-k8s-webhook-server-7df86c4f6c-cc5mj" (UID: "d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4") : secret "frr-k8s-webhook-server-cert" not found Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.270577 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-reloader\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.270790 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-frr-conf\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.270875 4753 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.270924 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/110bd7ac-0311-4b68-82b0-8d33b63a24bc-metrics-certs podName:110bd7ac-0311-4b68-82b0-8d33b63a24bc nodeName:}" failed. No retries permitted until 2026-01-29 14:15:41.770907729 +0000 UTC m=+776.465642111 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/110bd7ac-0311-4b68-82b0-8d33b63a24bc-metrics-certs") pod "frr-k8s-6xwns" (UID: "110bd7ac-0311-4b68-82b0-8d33b63a24bc") : secret "frr-k8s-certs-secret" not found Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.271505 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/110bd7ac-0311-4b68-82b0-8d33b63a24bc-frr-sockets\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.272054 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/110bd7ac-0311-4b68-82b0-8d33b63a24bc-frr-startup\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.292851 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mkz9\" (UniqueName: \"kubernetes.io/projected/d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4-kube-api-access-4mkz9\") pod \"frr-k8s-webhook-server-7df86c4f6c-cc5mj\" (UID: \"d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.293286 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmhpw\" (UniqueName: \"kubernetes.io/projected/110bd7ac-0311-4b68-82b0-8d33b63a24bc-kube-api-access-wmhpw\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.369497 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ceea48e-1742-416c-9ecd-389b66588487-metrics-certs\") pod \"controller-6968d8fdc4-kwv4j\" (UID: \"6ceea48e-1742-416c-9ecd-389b66588487\") " pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.369579 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtqbq\" (UniqueName: \"kubernetes.io/projected/6ceea48e-1742-416c-9ecd-389b66588487-kube-api-access-xtqbq\") pod \"controller-6968d8fdc4-kwv4j\" (UID: \"6ceea48e-1742-416c-9ecd-389b66588487\") " pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.369658 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ceea48e-1742-416c-9ecd-389b66588487-cert\") pod \"controller-6968d8fdc4-kwv4j\" (UID: \"6ceea48e-1742-416c-9ecd-389b66588487\") " pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.369772 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-memberlist\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.369839 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwqk8\" (UniqueName: 
\"kubernetes.io/projected/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-kube-api-access-cwqk8\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.369884 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-metrics-certs\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.369976 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-metallb-excludel2\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.369884 4753 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.370055 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-memberlist podName:ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec nodeName:}" failed. No retries permitted until 2026-01-29 14:15:41.870037645 +0000 UTC m=+776.564772027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-memberlist") pod "speaker-tvgdk" (UID: "ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec") : secret "metallb-memberlist" not found Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.370704 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-metallb-excludel2\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.372871 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-metrics-certs\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.385040 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwqk8\" (UniqueName: \"kubernetes.io/projected/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-kube-api-access-cwqk8\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.471821 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ceea48e-1742-416c-9ecd-389b66588487-metrics-certs\") pod \"controller-6968d8fdc4-kwv4j\" (UID: \"6ceea48e-1742-416c-9ecd-389b66588487\") " pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.471926 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtqbq\" (UniqueName: \"kubernetes.io/projected/6ceea48e-1742-416c-9ecd-389b66588487-kube-api-access-xtqbq\") pod \"controller-6968d8fdc4-kwv4j\" (UID: \"6ceea48e-1742-416c-9ecd-389b66588487\") " 
pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.472006 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ceea48e-1742-416c-9ecd-389b66588487-cert\") pod \"controller-6968d8fdc4-kwv4j\" (UID: \"6ceea48e-1742-416c-9ecd-389b66588487\") " pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.476019 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ceea48e-1742-416c-9ecd-389b66588487-cert\") pod \"controller-6968d8fdc4-kwv4j\" (UID: \"6ceea48e-1742-416c-9ecd-389b66588487\") " pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.476616 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ceea48e-1742-416c-9ecd-389b66588487-metrics-certs\") pod \"controller-6968d8fdc4-kwv4j\" (UID: \"6ceea48e-1742-416c-9ecd-389b66588487\") " pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.497909 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtqbq\" (UniqueName: \"kubernetes.io/projected/6ceea48e-1742-416c-9ecd-389b66588487-kube-api-access-xtqbq\") pod \"controller-6968d8fdc4-kwv4j\" (UID: \"6ceea48e-1742-416c-9ecd-389b66588487\") " pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.579042 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.775861 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-cc5mj\" (UID: \"d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.776301 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/110bd7ac-0311-4b68-82b0-8d33b63a24bc-metrics-certs\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.781916 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-cc5mj\" (UID: \"d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.782869 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/110bd7ac-0311-4b68-82b0-8d33b63a24bc-metrics-certs\") pod \"frr-k8s-6xwns\" (UID: \"110bd7ac-0311-4b68-82b0-8d33b63a24bc\") " pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.878206 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-memberlist\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " 
pod="metallb-system/speaker-tvgdk" Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.878397 4753 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 29 14:15:41 crc kubenswrapper[4753]: E0129 14:15:41.878484 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-memberlist podName:ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec nodeName:}" failed. No retries permitted until 2026-01-29 14:15:42.878462803 +0000 UTC m=+777.573197195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-memberlist") pod "speaker-tvgdk" (UID: "ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec") : secret "metallb-memberlist" not found Jan 29 14:15:41 crc kubenswrapper[4753]: I0129 14:15:41.910083 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-kwv4j"] Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.034638 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.047268 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.285228 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj"] Jan 29 14:15:42 crc kubenswrapper[4753]: W0129 14:15:42.290660 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6ad7a91_6b35_4d1d_aa3d_e9e1a5a164c4.slice/crio-275a8c86201a8cd658510368840351d61a08800b7c18251c50920cdd8f7a930c WatchSource:0}: Error finding container 275a8c86201a8cd658510368840351d61a08800b7c18251c50920cdd8f7a930c: Status 404 returned error can't find the container with id 275a8c86201a8cd658510368840351d61a08800b7c18251c50920cdd8f7a930c Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.840641 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" event={"ID":"d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4","Type":"ContainerStarted","Data":"275a8c86201a8cd658510368840351d61a08800b7c18251c50920cdd8f7a930c"} Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.845243 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerStarted","Data":"40a0ad553b5cdf42f46385b4a7361c7aebd2a8fd17736536c0861e773aa16f66"} Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.851011 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-kwv4j" event={"ID":"6ceea48e-1742-416c-9ecd-389b66588487","Type":"ContainerStarted","Data":"a46cdbc8125440dd8cac3a082526a41f58e02a345ac15ac4b2435d6395e7bf1d"} Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.851097 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-kwv4j" event={"ID":"6ceea48e-1742-416c-9ecd-389b66588487","Type":"ContainerStarted","Data":"bcb68a1c3c2ba4eb14b59982f9c04dd052c72abd4026b13bec26babb8d5b4df3"} Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.851133 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-kwv4j" 
event={"ID":"6ceea48e-1742-416c-9ecd-389b66588487","Type":"ContainerStarted","Data":"574d849d90223ea22d73f7cb5dcdd4394778dfe1dcf95ca6cc17246ca7540361"} Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.851216 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.883410 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-kwv4j" podStartSLOduration=1.883379817 podStartE2EDuration="1.883379817s" podCreationTimestamp="2026-01-29 14:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:15:42.877100207 +0000 UTC m=+777.571834619" watchObservedRunningTime="2026-01-29 14:15:42.883379817 +0000 UTC m=+777.578114239" Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.892209 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-memberlist\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:42 crc kubenswrapper[4753]: I0129 14:15:42.904101 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec-memberlist\") pod \"speaker-tvgdk\" (UID: \"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec\") " pod="metallb-system/speaker-tvgdk" Jan 29 14:15:43 crc kubenswrapper[4753]: I0129 14:15:43.065211 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tvgdk" Jan 29 14:15:43 crc kubenswrapper[4753]: W0129 14:15:43.086079 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca38a0c1_c5a0_4e33_8ea4_165facbeb3ec.slice/crio-2119c30f1df8ef3bbd4f624b64ce17551e4cb1a49f3e04d170826bc679d0f400 WatchSource:0}: Error finding container 2119c30f1df8ef3bbd4f624b64ce17551e4cb1a49f3e04d170826bc679d0f400: Status 404 returned error can't find the container with id 2119c30f1df8ef3bbd4f624b64ce17551e4cb1a49f3e04d170826bc679d0f400 Jan 29 14:15:43 crc kubenswrapper[4753]: I0129 14:15:43.861037 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tvgdk" event={"ID":"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec","Type":"ContainerStarted","Data":"666b51d223ad284c5a91e5ad80bd9b04f8ceb44758e06708136cb852d2873ced"} Jan 29 14:15:43 crc kubenswrapper[4753]: I0129 14:15:43.861341 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tvgdk" event={"ID":"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec","Type":"ContainerStarted","Data":"b719d896ab49cf5f2472c734788ed20b3070f81c270385f5ecf7432efa81ec43"} Jan 29 14:15:43 crc kubenswrapper[4753]: I0129 14:15:43.861353 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tvgdk" event={"ID":"ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec","Type":"ContainerStarted","Data":"2119c30f1df8ef3bbd4f624b64ce17551e4cb1a49f3e04d170826bc679d0f400"} Jan 29 14:15:43 crc kubenswrapper[4753]: I0129 14:15:43.861522 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tvgdk" Jan 29 14:15:43 crc kubenswrapper[4753]: I0129 14:15:43.895697 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-tvgdk" podStartSLOduration=2.895677766 podStartE2EDuration="2.895677766s" podCreationTimestamp="2026-01-29 14:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:15:43.894291468 +0000 UTC m=+778.589025870" watchObservedRunningTime="2026-01-29 14:15:43.895677766 +0000 UTC m=+778.590412148" Jan 29 14:15:49 crc kubenswrapper[4753]: I0129 14:15:49.911390 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" event={"ID":"d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4","Type":"ContainerStarted","Data":"59ea48d3ce595c3b1893f73b3a313db20330f1df239edd317bc26637c5f311f3"} Jan 29 14:15:49 crc kubenswrapper[4753]: I0129 14:15:49.911929 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:15:49 crc kubenswrapper[4753]: I0129 14:15:49.914071 4753 generic.go:334] "Generic (PLEG): container finished" podID="110bd7ac-0311-4b68-82b0-8d33b63a24bc" containerID="9780b329825e1b357f94643840b4f5c02ef47290b76006a9615a7d48345a34f7" exitCode=0 Jan 29 14:15:49 crc kubenswrapper[4753]: I0129 14:15:49.914105 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerDied","Data":"9780b329825e1b357f94643840b4f5c02ef47290b76006a9615a7d48345a34f7"} Jan 29 14:15:49 crc kubenswrapper[4753]: I0129 14:15:49.944052 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" podStartSLOduration=2.264528942 podStartE2EDuration="8.94402555s" podCreationTimestamp="2026-01-29 14:15:41 +0000 UTC" firstStartedPulling="2026-01-29 14:15:42.295255446 +0000 UTC m=+776.989989828" lastFinishedPulling="2026-01-29 14:15:48.974752044 +0000 UTC m=+783.669486436" observedRunningTime="2026-01-29 14:15:49.933880995 +0000 UTC m=+784.628615427" watchObservedRunningTime="2026-01-29 14:15:49.94402555 +0000 UTC m=+784.638759972" Jan 29 14:15:50 crc kubenswrapper[4753]: I0129 14:15:50.926088 4753 generic.go:334] "Generic (PLEG): container finished" podID="110bd7ac-0311-4b68-82b0-8d33b63a24bc" containerID="1b7fa4c66e1e22d27ee764ee477f7f0ca28a81366b5924794b6b17e4d91fcbdd" exitCode=0 Jan 29 14:15:50 crc kubenswrapper[4753]: I0129 14:15:50.926224 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerDied","Data":"1b7fa4c66e1e22d27ee764ee477f7f0ca28a81366b5924794b6b17e4d91fcbdd"} Jan 29 14:15:51 crc kubenswrapper[4753]: I0129 14:15:51.941315 4753 generic.go:334] "Generic (PLEG): container finished" podID="110bd7ac-0311-4b68-82b0-8d33b63a24bc" containerID="a9861e7ac56d6ae0b3823b23965b21150a337c926ab116035025bebca651db50" exitCode=0 Jan 29 14:15:51 crc kubenswrapper[4753]: I0129 14:15:51.941378 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerDied","Data":"a9861e7ac56d6ae0b3823b23965b21150a337c926ab116035025bebca651db50"} Jan 29 14:15:52 crc kubenswrapper[4753]: I0129 14:15:52.965385 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" 
event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerStarted","Data":"a8bc2665ecd771434a3cc6c76828d5f59e67028fff796b0e48a87a9241a7cfb9"} Jan 29 14:15:52 crc kubenswrapper[4753]: I0129 14:15:52.965912 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerStarted","Data":"2d1fda6babbec1319fc9ad33071f9122bad3a6a9a1a63d95f5034f2566800bbd"} Jan 29 14:15:52 crc kubenswrapper[4753]: I0129 14:15:52.965960 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerStarted","Data":"81bffa1d7a789dba6a4e08bc6274653902ad49ae8e58e59de31cad2b4f39cf94"} Jan 29 14:15:52 crc kubenswrapper[4753]: I0129 14:15:52.965986 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerStarted","Data":"f6f95443dbf1e6911b808d00fd3c15410a694812434f080fcfd9f36b59f09ff8"} Jan 29 14:15:52 crc kubenswrapper[4753]: I0129 14:15:52.966008 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerStarted","Data":"c876ce7fa1c42acae8a59ef19f2738b0738055e98be97f0d7a345f9cc0014de2"} Jan 29 14:15:53 crc kubenswrapper[4753]: I0129 14:15:53.072575 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tvgdk" Jan 29 14:15:53 crc kubenswrapper[4753]: I0129 14:15:53.976510 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xwns" event={"ID":"110bd7ac-0311-4b68-82b0-8d33b63a24bc","Type":"ContainerStarted","Data":"1fe1d78ffe7990022dfa5c8fcb53dfffe23a3f9f3fd0654f878fd20c00fb02a1"} Jan 29 14:15:53 crc kubenswrapper[4753]: I0129 14:15:53.976952 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:53 crc kubenswrapper[4753]: I0129 14:15:53.998514 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-6xwns" podStartSLOduration=6.219136392 podStartE2EDuration="12.998480696s" podCreationTimestamp="2026-01-29 14:15:41 +0000 UTC" firstStartedPulling="2026-01-29 14:15:42.182557583 +0000 UTC m=+776.877291965" lastFinishedPulling="2026-01-29 14:15:48.961901877 +0000 UTC m=+783.656636269" observedRunningTime="2026-01-29 14:15:53.996434181 +0000 UTC m=+788.691168573" watchObservedRunningTime="2026-01-29 14:15:53.998480696 +0000 UTC m=+788.693215078" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.841711 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8"] Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.842793 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.846543 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.862182 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8"] Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.878967 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.879033 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kgc5\" (UniqueName: \"kubernetes.io/projected/f141c868-e82d-452b-9ae8-3e160d964237-kube-api-access-8kgc5\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.879081 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.979543 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.979592 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kgc5\" (UniqueName: \"kubernetes.io/projected/f141c868-e82d-452b-9ae8-3e160d964237-kube-api-access-8kgc5\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.979640 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.980088 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:54 crc kubenswrapper[4753]: I0129 14:15:54.980304 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:55 crc kubenswrapper[4753]: I0129 14:15:55.006090 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kgc5\" (UniqueName: \"kubernetes.io/projected/f141c868-e82d-452b-9ae8-3e160d964237-kube-api-access-8kgc5\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:55 crc kubenswrapper[4753]: I0129 14:15:55.157467 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:15:55 crc kubenswrapper[4753]: I0129 14:15:55.655447 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8"] Jan 29 14:15:55 crc kubenswrapper[4753]: I0129 14:15:55.989079 4753 generic.go:334] "Generic (PLEG): container finished" podID="f141c868-e82d-452b-9ae8-3e160d964237" containerID="12b3f2f2cac84656989b9b59cebce126d0e4432d920e45189813445a23607103" exitCode=0 Jan 29 14:15:55 crc kubenswrapper[4753]: I0129 14:15:55.989407 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" event={"ID":"f141c868-e82d-452b-9ae8-3e160d964237","Type":"ContainerDied","Data":"12b3f2f2cac84656989b9b59cebce126d0e4432d920e45189813445a23607103"} Jan 29 14:15:55 crc kubenswrapper[4753]: I0129 14:15:55.989432 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" event={"ID":"f141c868-e82d-452b-9ae8-3e160d964237","Type":"ContainerStarted","Data":"99766e5dc89f22b7d42638bd317d1fe30f4560167c6660f7013dbda65676b9fc"} Jan 29 14:15:57 crc kubenswrapper[4753]: I0129 14:15:57.035400 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:57 crc kubenswrapper[4753]: I0129 14:15:57.055483 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:15:57 crc kubenswrapper[4753]: I0129 14:15:57.055588 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 
29 14:15:57 crc kubenswrapper[4753]: I0129 14:15:57.055677 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:15:57 crc kubenswrapper[4753]: I0129 14:15:57.056911 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30a83f4047e7a21e63740d93f083e75525a0e3fe674659ba74e59493ea388ecf"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:15:57 crc kubenswrapper[4753]: I0129 14:15:57.057069 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://30a83f4047e7a21e63740d93f083e75525a0e3fe674659ba74e59493ea388ecf" gracePeriod=600 Jan 29 14:15:57 crc kubenswrapper[4753]: I0129 14:15:57.092063 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6xwns" Jan 29 14:15:58 crc kubenswrapper[4753]: I0129 14:15:58.005970 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="30a83f4047e7a21e63740d93f083e75525a0e3fe674659ba74e59493ea388ecf" exitCode=0 Jan 29 14:15:58 crc kubenswrapper[4753]: I0129 14:15:58.006232 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"30a83f4047e7a21e63740d93f083e75525a0e3fe674659ba74e59493ea388ecf"} Jan 29 14:15:58 crc kubenswrapper[4753]: I0129 14:15:58.006972 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"60212ebb28237ec94902995089383e664d1c6ec845691e27febd40b2f34c00cd"} Jan 29 14:15:58 crc kubenswrapper[4753]: I0129 14:15:58.007000 4753 scope.go:117] "RemoveContainer" containerID="90491981003addce6f1b9660cc7a2bd6006504000ac2c231afb7b0bfc2f931be" Jan 29 14:16:00 crc kubenswrapper[4753]: I0129 14:16:00.029251 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" event={"ID":"f141c868-e82d-452b-9ae8-3e160d964237","Type":"ContainerStarted","Data":"3fdf0f2c65278e45cade3ccc11725448fb0ef3dddcb047e9eb94865ac8944221"} Jan 29 14:16:01 crc kubenswrapper[4753]: I0129 14:16:01.039437 4753 generic.go:334] "Generic (PLEG): container finished" podID="f141c868-e82d-452b-9ae8-3e160d964237" containerID="3fdf0f2c65278e45cade3ccc11725448fb0ef3dddcb047e9eb94865ac8944221" exitCode=0 Jan 29 14:16:01 crc kubenswrapper[4753]: I0129 14:16:01.039512 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" event={"ID":"f141c868-e82d-452b-9ae8-3e160d964237","Type":"ContainerDied","Data":"3fdf0f2c65278e45cade3ccc11725448fb0ef3dddcb047e9eb94865ac8944221"} Jan 29 14:16:01 crc kubenswrapper[4753]: I0129 14:16:01.586227 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-kwv4j" Jan 29 14:16:02 crc kubenswrapper[4753]: I0129 14:16:02.039699 4753 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6xwns" Jan 29 14:16:02 crc kubenswrapper[4753]: I0129 14:16:02.052071 4753 generic.go:334] "Generic (PLEG): container finished" podID="f141c868-e82d-452b-9ae8-3e160d964237" containerID="296a2c2f5b360c586bf950619b1930e194be22d558d85445780284c6042502d0" exitCode=0 Jan 29 14:16:02 crc kubenswrapper[4753]: I0129 14:16:02.052192 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" event={"ID":"f141c868-e82d-452b-9ae8-3e160d964237","Type":"ContainerDied","Data":"296a2c2f5b360c586bf950619b1930e194be22d558d85445780284c6042502d0"} Jan 29 14:16:02 crc kubenswrapper[4753]: I0129 14:16:02.064588 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-cc5mj" Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.439038 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.526781 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-bundle\") pod \"f141c868-e82d-452b-9ae8-3e160d964237\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.527219 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-util\") pod \"f141c868-e82d-452b-9ae8-3e160d964237\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.527306 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kgc5\" (UniqueName: \"kubernetes.io/projected/f141c868-e82d-452b-9ae8-3e160d964237-kube-api-access-8kgc5\") pod \"f141c868-e82d-452b-9ae8-3e160d964237\" (UID: \"f141c868-e82d-452b-9ae8-3e160d964237\") " Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.528913 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-bundle" (OuterVolumeSpecName: "bundle") pod "f141c868-e82d-452b-9ae8-3e160d964237" (UID: "f141c868-e82d-452b-9ae8-3e160d964237"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.538560 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f141c868-e82d-452b-9ae8-3e160d964237-kube-api-access-8kgc5" (OuterVolumeSpecName: "kube-api-access-8kgc5") pod "f141c868-e82d-452b-9ae8-3e160d964237" (UID: "f141c868-e82d-452b-9ae8-3e160d964237"). InnerVolumeSpecName "kube-api-access-8kgc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.542697 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-util" (OuterVolumeSpecName: "util") pod "f141c868-e82d-452b-9ae8-3e160d964237" (UID: "f141c868-e82d-452b-9ae8-3e160d964237"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.629135 4753 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-util\") on node \"crc\" DevicePath \"\"" Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.629205 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kgc5\" (UniqueName: \"kubernetes.io/projected/f141c868-e82d-452b-9ae8-3e160d964237-kube-api-access-8kgc5\") on node \"crc\" DevicePath \"\"" Jan 29 14:16:03 crc kubenswrapper[4753]: I0129 14:16:03.629224 4753 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f141c868-e82d-452b-9ae8-3e160d964237-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:16:04 crc kubenswrapper[4753]: I0129 14:16:04.071192 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" event={"ID":"f141c868-e82d-452b-9ae8-3e160d964237","Type":"ContainerDied","Data":"99766e5dc89f22b7d42638bd317d1fe30f4560167c6660f7013dbda65676b9fc"} Jan 29 14:16:04 crc kubenswrapper[4753]: I0129 14:16:04.071256 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99766e5dc89f22b7d42638bd317d1fe30f4560167c6660f7013dbda65676b9fc" Jan 29 14:16:04 crc kubenswrapper[4753]: I0129 14:16:04.071285 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.457722 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst"] Jan 29 14:16:08 crc kubenswrapper[4753]: E0129 14:16:08.459060 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f141c868-e82d-452b-9ae8-3e160d964237" containerName="util" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.459081 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f141c868-e82d-452b-9ae8-3e160d964237" containerName="util" Jan 29 14:16:08 crc kubenswrapper[4753]: E0129 14:16:08.459104 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f141c868-e82d-452b-9ae8-3e160d964237" containerName="extract" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.459116 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f141c868-e82d-452b-9ae8-3e160d964237" containerName="extract" Jan 29 14:16:08 crc kubenswrapper[4753]: E0129 14:16:08.459128 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f141c868-e82d-452b-9ae8-3e160d964237" containerName="pull" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.459137 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f141c868-e82d-452b-9ae8-3e160d964237" containerName="pull" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.459331 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f141c868-e82d-452b-9ae8-3e160d964237" containerName="extract" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.459957 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.470861 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.475427 4753 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-bnrm6" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.481083 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.489581 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst"] Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.498275 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb9f18e0-6db7-4cf5-b209-839265130791-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k2nst\" (UID: \"fb9f18e0-6db7-4cf5-b209-839265130791\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.498359 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxl6x\" (UniqueName: \"kubernetes.io/projected/fb9f18e0-6db7-4cf5-b209-839265130791-kube-api-access-mxl6x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k2nst\" (UID: \"fb9f18e0-6db7-4cf5-b209-839265130791\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.599359 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxl6x\" (UniqueName: \"kubernetes.io/projected/fb9f18e0-6db7-4cf5-b209-839265130791-kube-api-access-mxl6x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k2nst\" (UID: \"fb9f18e0-6db7-4cf5-b209-839265130791\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.599435 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb9f18e0-6db7-4cf5-b209-839265130791-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k2nst\" (UID: \"fb9f18e0-6db7-4cf5-b209-839265130791\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.600084 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb9f18e0-6db7-4cf5-b209-839265130791-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k2nst\" (UID: \"fb9f18e0-6db7-4cf5-b209-839265130791\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.618831 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxl6x\" (UniqueName: \"kubernetes.io/projected/fb9f18e0-6db7-4cf5-b209-839265130791-kube-api-access-mxl6x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-k2nst\" (UID: \"fb9f18e0-6db7-4cf5-b209-839265130791\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" Jan 29 14:16:08 crc kubenswrapper[4753]: I0129 14:16:08.776505 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" Jan 29 14:16:09 crc kubenswrapper[4753]: I0129 14:16:09.251739 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst"] Jan 29 14:16:09 crc kubenswrapper[4753]: W0129 14:16:09.258384 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9f18e0_6db7_4cf5_b209_839265130791.slice/crio-b5d5b8c8cc830818bf6a850b725cc4c80b3d13bc95146f461d275962ac5c7d04 WatchSource:0}: Error finding container b5d5b8c8cc830818bf6a850b725cc4c80b3d13bc95146f461d275962ac5c7d04: Status 404 returned error can't find the container with id b5d5b8c8cc830818bf6a850b725cc4c80b3d13bc95146f461d275962ac5c7d04 Jan 29 14:16:10 crc kubenswrapper[4753]: I0129 14:16:10.111441 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" event={"ID":"fb9f18e0-6db7-4cf5-b209-839265130791","Type":"ContainerStarted","Data":"b5d5b8c8cc830818bf6a850b725cc4c80b3d13bc95146f461d275962ac5c7d04"} Jan 29 14:16:13 crc kubenswrapper[4753]: I0129 14:16:13.131786 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" event={"ID":"fb9f18e0-6db7-4cf5-b209-839265130791","Type":"ContainerStarted","Data":"6e9ec23d9dbf282083983d2480752b578fef07b5fa5ebe0791bddbcb32939f53"} Jan 29 14:16:13 crc kubenswrapper[4753]: I0129 14:16:13.161341 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-k2nst" podStartSLOduration=1.996033264 podStartE2EDuration="5.161323479s" podCreationTimestamp="2026-01-29 14:16:08 +0000 UTC" firstStartedPulling="2026-01-29 14:16:09.261201381 +0000 UTC m=+803.955935763" lastFinishedPulling="2026-01-29 14:16:12.426491596 +0000 UTC m=+807.121225978" observedRunningTime="2026-01-29 14:16:13.157590848 +0000 UTC m=+807.852325270" watchObservedRunningTime="2026-01-29 14:16:13.161323479 +0000 UTC m=+807.856057851" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.616596 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bhj74"] Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.617822 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.622335 4753 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-mktmh" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.622735 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.626452 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.633931 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnrhk\" (UniqueName: \"kubernetes.io/projected/037d7c67-99e9-410d-8f17-77d3f95b1443-kube-api-access-vnrhk\") pod \"cert-manager-webhook-6888856db4-bhj74\" (UID: \"037d7c67-99e9-410d-8f17-77d3f95b1443\") " pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.633985 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/037d7c67-99e9-410d-8f17-77d3f95b1443-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bhj74\" (UID: \"037d7c67-99e9-410d-8f17-77d3f95b1443\") " pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.635767 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bhj74"] Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.735792 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnrhk\" (UniqueName: \"kubernetes.io/projected/037d7c67-99e9-410d-8f17-77d3f95b1443-kube-api-access-vnrhk\") pod \"cert-manager-webhook-6888856db4-bhj74\" (UID: \"037d7c67-99e9-410d-8f17-77d3f95b1443\") " pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.735860 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/037d7c67-99e9-410d-8f17-77d3f95b1443-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bhj74\" (UID: \"037d7c67-99e9-410d-8f17-77d3f95b1443\") " pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.775272 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/037d7c67-99e9-410d-8f17-77d3f95b1443-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bhj74\" (UID: \"037d7c67-99e9-410d-8f17-77d3f95b1443\") " pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.776869 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnrhk\" (UniqueName: \"kubernetes.io/projected/037d7c67-99e9-410d-8f17-77d3f95b1443-kube-api-access-vnrhk\") pod \"cert-manager-webhook-6888856db4-bhj74\" (UID: \"037d7c67-99e9-410d-8f17-77d3f95b1443\") " pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:16 crc kubenswrapper[4753]: I0129 14:16:16.938740 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:17 crc kubenswrapper[4753]: I0129 14:16:17.429562 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bhj74"] Jan 29 14:16:18 crc kubenswrapper[4753]: I0129 14:16:18.174081 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" event={"ID":"037d7c67-99e9-410d-8f17-77d3f95b1443","Type":"ContainerStarted","Data":"a74aed3e218465a566d8fe24247d38712b78fdbf597ec7cbb5c6d306245ade3d"} Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.238233 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-55p62"] Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.239057 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.241444 4753 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-j6nz5" Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.247437 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-55p62"] Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.268807 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08add473-034f-403a-9f96-7d5fc2e4c8df-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-55p62\" (UID: \"08add473-034f-403a-9f96-7d5fc2e4c8df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.268926 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdzh9\" (UniqueName: \"kubernetes.io/projected/08add473-034f-403a-9f96-7d5fc2e4c8df-kube-api-access-qdzh9\") pod \"cert-manager-cainjector-5545bd876-55p62\" (UID: \"08add473-034f-403a-9f96-7d5fc2e4c8df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.371707 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08add473-034f-403a-9f96-7d5fc2e4c8df-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-55p62\" (UID: \"08add473-034f-403a-9f96-7d5fc2e4c8df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.371767 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdzh9\" (UniqueName: \"kubernetes.io/projected/08add473-034f-403a-9f96-7d5fc2e4c8df-kube-api-access-qdzh9\") pod \"cert-manager-cainjector-5545bd876-55p62\" (UID: \"08add473-034f-403a-9f96-7d5fc2e4c8df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.401627 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdzh9\" (UniqueName: \"kubernetes.io/projected/08add473-034f-403a-9f96-7d5fc2e4c8df-kube-api-access-qdzh9\") pod \"cert-manager-cainjector-5545bd876-55p62\" (UID: \"08add473-034f-403a-9f96-7d5fc2e4c8df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.403504 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08add473-034f-403a-9f96-7d5fc2e4c8df-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-55p62\" (UID: \"08add473-034f-403a-9f96-7d5fc2e4c8df\") " pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.560321 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" Jan 29 14:16:19 crc kubenswrapper[4753]: I0129 14:16:19.984165 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-55p62"] Jan 29 14:16:19 crc kubenswrapper[4753]: W0129 14:16:19.991943 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08add473_034f_403a_9f96_7d5fc2e4c8df.slice/crio-36c6e81061eecbc3156da4471b355dae503a79d9c0148b3a1adfe2659da3be7c WatchSource:0}: Error finding container 36c6e81061eecbc3156da4471b355dae503a79d9c0148b3a1adfe2659da3be7c: Status 404 returned error can't find the container with id 36c6e81061eecbc3156da4471b355dae503a79d9c0148b3a1adfe2659da3be7c Jan 29 14:16:20 crc kubenswrapper[4753]: I0129 14:16:20.193066 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" event={"ID":"08add473-034f-403a-9f96-7d5fc2e4c8df","Type":"ContainerStarted","Data":"36c6e81061eecbc3156da4471b355dae503a79d9c0148b3a1adfe2659da3be7c"} Jan 29 14:16:22 crc kubenswrapper[4753]: I0129 14:16:22.207758 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" event={"ID":"037d7c67-99e9-410d-8f17-77d3f95b1443","Type":"ContainerStarted","Data":"c7a9f108a877a7d0bdaf06ffd1c9c8f68b8f305463692958a26dc2e11eaec9c4"} Jan 29 14:16:22 crc kubenswrapper[4753]: I0129 14:16:22.208114 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:22 crc kubenswrapper[4753]: I0129 14:16:22.230326 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" podStartSLOduration=2.066109659 podStartE2EDuration="6.2302993s" podCreationTimestamp="2026-01-29 14:16:16 +0000 UTC" firstStartedPulling="2026-01-29 14:16:17.443017265 +0000 UTC m=+812.137751677" lastFinishedPulling="2026-01-29 14:16:21.607206926 +0000 UTC m=+816.301941318" observedRunningTime="2026-01-29 14:16:22.228904322 +0000 UTC m=+816.923638714" watchObservedRunningTime="2026-01-29 14:16:22.2302993 +0000 UTC m=+816.925033722" Jan 29 14:16:23 crc kubenswrapper[4753]: I0129 14:16:23.217179 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" event={"ID":"08add473-034f-403a-9f96-7d5fc2e4c8df","Type":"ContainerStarted","Data":"c5b1a9727c72d272c59df3ad171237fddc60b60e852f62ca8e79f70ebfc32e1b"} Jan 29 14:16:23 crc kubenswrapper[4753]: I0129 14:16:23.236479 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-55p62" podStartSLOduration=1.663360468 podStartE2EDuration="4.236457573s" podCreationTimestamp="2026-01-29 14:16:19 +0000 UTC" firstStartedPulling="2026-01-29 14:16:19.993908211 +0000 UTC m=+814.688642593" lastFinishedPulling="2026-01-29 14:16:22.567005316 +0000 UTC m=+817.261739698" 
observedRunningTime="2026-01-29 14:16:23.230848722 +0000 UTC m=+817.925583104" watchObservedRunningTime="2026-01-29 14:16:23.236457573 +0000 UTC m=+817.931191985" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.058796 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-frsj7"] Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.073577 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.078592 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frsj7"] Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.162827 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-catalog-content\") pod \"redhat-marketplace-frsj7\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.162879 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k428k\" (UniqueName: \"kubernetes.io/projected/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-kube-api-access-k428k\") pod \"redhat-marketplace-frsj7\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.162899 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-utilities\") pod \"redhat-marketplace-frsj7\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.264219 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-catalog-content\") pod \"redhat-marketplace-frsj7\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.264282 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k428k\" (UniqueName: \"kubernetes.io/projected/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-kube-api-access-k428k\") pod \"redhat-marketplace-frsj7\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.264309 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-utilities\") pod \"redhat-marketplace-frsj7\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.264880 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-catalog-content\") pod \"redhat-marketplace-frsj7\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: 
I0129 14:16:26.264926 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-utilities\") pod \"redhat-marketplace-frsj7\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.292850 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k428k\" (UniqueName: \"kubernetes.io/projected/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-kube-api-access-k428k\") pod \"redhat-marketplace-frsj7\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.390270 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.606742 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frsj7"] Jan 29 14:16:26 crc kubenswrapper[4753]: W0129 14:16:26.612577 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaad9fc6c_2cf9_4e43_82f2_e7ccd775a798.slice/crio-b7ac8153756ce48da784eb0a89a7064fc9be3e453f46b35031e828b48c22f3be WatchSource:0}: Error finding container b7ac8153756ce48da784eb0a89a7064fc9be3e453f46b35031e828b48c22f3be: Status 404 returned error can't find the container with id b7ac8153756ce48da784eb0a89a7064fc9be3e453f46b35031e828b48c22f3be Jan 29 14:16:26 crc kubenswrapper[4753]: I0129 14:16:26.941529 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-bhj74" Jan 29 14:16:27 crc kubenswrapper[4753]: I0129 14:16:27.246930 4753 generic.go:334] "Generic (PLEG): container finished" podID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerID="465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27" exitCode=0 Jan 29 14:16:27 crc kubenswrapper[4753]: I0129 14:16:27.246981 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frsj7" event={"ID":"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798","Type":"ContainerDied","Data":"465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27"} Jan 29 14:16:27 crc kubenswrapper[4753]: I0129 14:16:27.247014 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frsj7" event={"ID":"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798","Type":"ContainerStarted","Data":"b7ac8153756ce48da784eb0a89a7064fc9be3e453f46b35031e828b48c22f3be"} Jan 29 14:16:28 crc kubenswrapper[4753]: I0129 14:16:28.255490 4753 generic.go:334] "Generic (PLEG): container finished" podID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerID="2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647" exitCode=0 Jan 29 14:16:28 crc kubenswrapper[4753]: I0129 14:16:28.255555 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frsj7" event={"ID":"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798","Type":"ContainerDied","Data":"2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647"} Jan 29 14:16:29 crc kubenswrapper[4753]: I0129 14:16:29.264487 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frsj7" 
event={"ID":"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798","Type":"ContainerStarted","Data":"6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9"} Jan 29 14:16:29 crc kubenswrapper[4753]: I0129 14:16:29.288221 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-frsj7" podStartSLOduration=1.838864852 podStartE2EDuration="3.288202078s" podCreationTimestamp="2026-01-29 14:16:26 +0000 UTC" firstStartedPulling="2026-01-29 14:16:27.249277444 +0000 UTC m=+821.944011866" lastFinishedPulling="2026-01-29 14:16:28.69861471 +0000 UTC m=+823.393349092" observedRunningTime="2026-01-29 14:16:29.286301866 +0000 UTC m=+823.981036258" watchObservedRunningTime="2026-01-29 14:16:29.288202078 +0000 UTC m=+823.982936470" Jan 29 14:16:34 crc kubenswrapper[4753]: I0129 14:16:34.838240 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-t4wnm"] Jan 29 14:16:34 crc kubenswrapper[4753]: I0129 14:16:34.839958 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-t4wnm" Jan 29 14:16:34 crc kubenswrapper[4753]: I0129 14:16:34.842854 4753 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-74zkj" Jan 29 14:16:34 crc kubenswrapper[4753]: I0129 14:16:34.861869 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-t4wnm"] Jan 29 14:16:34 crc kubenswrapper[4753]: I0129 14:16:34.996279 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4wpq\" (UniqueName: \"kubernetes.io/projected/9a2d92f9-fefb-4f88-9ee2-3841da9e0a74-kube-api-access-v4wpq\") pod \"cert-manager-545d4d4674-t4wnm\" (UID: \"9a2d92f9-fefb-4f88-9ee2-3841da9e0a74\") " pod="cert-manager/cert-manager-545d4d4674-t4wnm" Jan 29 14:16:34 crc kubenswrapper[4753]: I0129 14:16:34.996834 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a2d92f9-fefb-4f88-9ee2-3841da9e0a74-bound-sa-token\") pod \"cert-manager-545d4d4674-t4wnm\" (UID: \"9a2d92f9-fefb-4f88-9ee2-3841da9e0a74\") " pod="cert-manager/cert-manager-545d4d4674-t4wnm" Jan 29 14:16:35 crc kubenswrapper[4753]: I0129 14:16:35.097946 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a2d92f9-fefb-4f88-9ee2-3841da9e0a74-bound-sa-token\") pod \"cert-manager-545d4d4674-t4wnm\" (UID: \"9a2d92f9-fefb-4f88-9ee2-3841da9e0a74\") " pod="cert-manager/cert-manager-545d4d4674-t4wnm" Jan 29 14:16:35 crc kubenswrapper[4753]: I0129 14:16:35.098064 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4wpq\" (UniqueName: \"kubernetes.io/projected/9a2d92f9-fefb-4f88-9ee2-3841da9e0a74-kube-api-access-v4wpq\") pod \"cert-manager-545d4d4674-t4wnm\" (UID: \"9a2d92f9-fefb-4f88-9ee2-3841da9e0a74\") " pod="cert-manager/cert-manager-545d4d4674-t4wnm" Jan 29 14:16:35 crc kubenswrapper[4753]: I0129 14:16:35.135811 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4wpq\" (UniqueName: \"kubernetes.io/projected/9a2d92f9-fefb-4f88-9ee2-3841da9e0a74-kube-api-access-v4wpq\") pod \"cert-manager-545d4d4674-t4wnm\" (UID: \"9a2d92f9-fefb-4f88-9ee2-3841da9e0a74\") " pod="cert-manager/cert-manager-545d4d4674-t4wnm" Jan 29 14:16:35 crc 
kubenswrapper[4753]: I0129 14:16:35.136665 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a2d92f9-fefb-4f88-9ee2-3841da9e0a74-bound-sa-token\") pod \"cert-manager-545d4d4674-t4wnm\" (UID: \"9a2d92f9-fefb-4f88-9ee2-3841da9e0a74\") " pod="cert-manager/cert-manager-545d4d4674-t4wnm" Jan 29 14:16:35 crc kubenswrapper[4753]: I0129 14:16:35.168868 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-t4wnm" Jan 29 14:16:35 crc kubenswrapper[4753]: I0129 14:16:35.626659 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-t4wnm"] Jan 29 14:16:35 crc kubenswrapper[4753]: W0129 14:16:35.633897 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a2d92f9_fefb_4f88_9ee2_3841da9e0a74.slice/crio-4b144be67c4c85c4457ca4597cccac21f99f09d426aa2b7f690046d0698c54ae WatchSource:0}: Error finding container 4b144be67c4c85c4457ca4597cccac21f99f09d426aa2b7f690046d0698c54ae: Status 404 returned error can't find the container with id 4b144be67c4c85c4457ca4597cccac21f99f09d426aa2b7f690046d0698c54ae Jan 29 14:16:36 crc kubenswrapper[4753]: I0129 14:16:36.324215 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-t4wnm" event={"ID":"9a2d92f9-fefb-4f88-9ee2-3841da9e0a74","Type":"ContainerStarted","Data":"25c531311833af71ca48087e580435978b5205f2ed332b1156e685257d793084"} Jan 29 14:16:36 crc kubenswrapper[4753]: I0129 14:16:36.324277 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-t4wnm" event={"ID":"9a2d92f9-fefb-4f88-9ee2-3841da9e0a74","Type":"ContainerStarted","Data":"4b144be67c4c85c4457ca4597cccac21f99f09d426aa2b7f690046d0698c54ae"} Jan 29 14:16:36 crc kubenswrapper[4753]: I0129 14:16:36.353508 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-t4wnm" podStartSLOduration=2.353484346 podStartE2EDuration="2.353484346s" podCreationTimestamp="2026-01-29 14:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:16:36.349935949 +0000 UTC m=+831.044670361" watchObservedRunningTime="2026-01-29 14:16:36.353484346 +0000 UTC m=+831.048218768" Jan 29 14:16:36 crc kubenswrapper[4753]: I0129 14:16:36.391449 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:36 crc kubenswrapper[4753]: I0129 14:16:36.391620 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:36 crc kubenswrapper[4753]: I0129 14:16:36.521241 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:37 crc kubenswrapper[4753]: I0129 14:16:37.405126 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:37 crc kubenswrapper[4753]: I0129 14:16:37.488039 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frsj7"] Jan 29 14:16:39 crc kubenswrapper[4753]: I0129 14:16:39.345892 4753 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-frsj7" podUID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerName="registry-server" containerID="cri-o://6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9" gracePeriod=2 Jan 29 14:16:39 crc kubenswrapper[4753]: I0129 14:16:39.803236 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:39 crc kubenswrapper[4753]: I0129 14:16:39.972460 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-utilities\") pod \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " Jan 29 14:16:39 crc kubenswrapper[4753]: I0129 14:16:39.972648 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k428k\" (UniqueName: \"kubernetes.io/projected/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-kube-api-access-k428k\") pod \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " Jan 29 14:16:39 crc kubenswrapper[4753]: I0129 14:16:39.972836 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-catalog-content\") pod \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\" (UID: \"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798\") " Jan 29 14:16:39 crc kubenswrapper[4753]: I0129 14:16:39.973890 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-utilities" (OuterVolumeSpecName: "utilities") pod "aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" (UID: "aad9fc6c-2cf9-4e43-82f2-e7ccd775a798"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:16:39 crc kubenswrapper[4753]: I0129 14:16:39.981139 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-kube-api-access-k428k" (OuterVolumeSpecName: "kube-api-access-k428k") pod "aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" (UID: "aad9fc6c-2cf9-4e43-82f2-e7ccd775a798"). InnerVolumeSpecName "kube-api-access-k428k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.018376 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" (UID: "aad9fc6c-2cf9-4e43-82f2-e7ccd775a798"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.074566 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k428k\" (UniqueName: \"kubernetes.io/projected/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-kube-api-access-k428k\") on node \"crc\" DevicePath \"\"" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.074619 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.074638 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.369784 4753 generic.go:334] "Generic (PLEG): container finished" podID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerID="6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9" exitCode=0 Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.369930 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frsj7" event={"ID":"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798","Type":"ContainerDied","Data":"6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9"} Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.369963 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frsj7" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.370008 4753 scope.go:117] "RemoveContainer" containerID="6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.369986 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frsj7" event={"ID":"aad9fc6c-2cf9-4e43-82f2-e7ccd775a798","Type":"ContainerDied","Data":"b7ac8153756ce48da784eb0a89a7064fc9be3e453f46b35031e828b48c22f3be"} Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.414679 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frsj7"] Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.419097 4753 scope.go:117] "RemoveContainer" containerID="2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.424696 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-frsj7"] Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.440587 4753 scope.go:117] "RemoveContainer" containerID="465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.472337 4753 scope.go:117] "RemoveContainer" containerID="6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9" Jan 29 14:16:40 crc kubenswrapper[4753]: E0129 14:16:40.472972 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9\": container with ID starting with 6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9 not found: ID does not exist" containerID="6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.473024 4753 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9"} err="failed to get container status \"6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9\": rpc error: code = NotFound desc = could not find container \"6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9\": container with ID starting with 6ffe669c5e4d40464363e451546ae24a1678e0a523b6e4da06f50b85fd139bb9 not found: ID does not exist" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.473061 4753 scope.go:117] "RemoveContainer" containerID="2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647" Jan 29 14:16:40 crc kubenswrapper[4753]: E0129 14:16:40.473610 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647\": container with ID starting with 2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647 not found: ID does not exist" containerID="2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.473690 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647"} err="failed to get container status \"2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647\": rpc error: code = NotFound desc = could not find container \"2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647\": container with ID starting with 2da53dbbe3f7f46f2100396f5df4998f96e461cb37ef43f617142773fb446647 not found: ID does not exist" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.473736 4753 scope.go:117] "RemoveContainer" containerID="465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27" Jan 29 14:16:40 crc kubenswrapper[4753]: E0129 14:16:40.474138 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27\": container with ID starting with 465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27 not found: ID does not exist" containerID="465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27" Jan 29 14:16:40 crc kubenswrapper[4753]: I0129 14:16:40.474237 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27"} err="failed to get container status \"465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27\": rpc error: code = NotFound desc = could not find container \"465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27\": container with ID starting with 465909643e5ebaa766eabb61f9efcaaeb1759b04bf64af2d3086c32cae662d27 not found: ID does not exist" Jan 29 14:16:42 crc kubenswrapper[4753]: I0129 14:16:42.163341 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" path="/var/lib/kubelet/pods/aad9fc6c-2cf9-4e43-82f2-e7ccd775a798/volumes" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.582579 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2lwpn"] Jan 29 14:16:43 crc kubenswrapper[4753]: E0129 14:16:43.583141 4753 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerName="extract-content" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.583169 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerName="extract-content" Jan 29 14:16:43 crc kubenswrapper[4753]: E0129 14:16:43.583209 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerName="registry-server" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.583217 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerName="registry-server" Jan 29 14:16:43 crc kubenswrapper[4753]: E0129 14:16:43.583228 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerName="extract-utilities" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.583238 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerName="extract-utilities" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.583371 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad9fc6c-2cf9-4e43-82f2-e7ccd775a798" containerName="registry-server" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.583799 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2lwpn" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.587781 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.588361 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-h2pkc" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.589846 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.590147 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2lwpn"] Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.639722 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw6gt\" (UniqueName: \"kubernetes.io/projected/c052ce8f-85e8-4e3e-9874-71583bfe166b-kube-api-access-hw6gt\") pod \"openstack-operator-index-2lwpn\" (UID: \"c052ce8f-85e8-4e3e-9874-71583bfe166b\") " pod="openstack-operators/openstack-operator-index-2lwpn" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.740895 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw6gt\" (UniqueName: \"kubernetes.io/projected/c052ce8f-85e8-4e3e-9874-71583bfe166b-kube-api-access-hw6gt\") pod \"openstack-operator-index-2lwpn\" (UID: \"c052ce8f-85e8-4e3e-9874-71583bfe166b\") " pod="openstack-operators/openstack-operator-index-2lwpn" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.779183 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw6gt\" (UniqueName: \"kubernetes.io/projected/c052ce8f-85e8-4e3e-9874-71583bfe166b-kube-api-access-hw6gt\") pod \"openstack-operator-index-2lwpn\" (UID: \"c052ce8f-85e8-4e3e-9874-71583bfe166b\") " pod="openstack-operators/openstack-operator-index-2lwpn" Jan 29 14:16:43 crc kubenswrapper[4753]: I0129 14:16:43.911830 4753 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2lwpn" Jan 29 14:16:44 crc kubenswrapper[4753]: I0129 14:16:44.443753 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2lwpn"] Jan 29 14:16:45 crc kubenswrapper[4753]: I0129 14:16:45.417455 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2lwpn" event={"ID":"c052ce8f-85e8-4e3e-9874-71583bfe166b","Type":"ContainerStarted","Data":"29718dde3991401ba9e089e26606ce7e1eb2304b1376fd1b5f50ce6195dc1fd2"} Jan 29 14:16:46 crc kubenswrapper[4753]: I0129 14:16:46.430266 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2lwpn" event={"ID":"c052ce8f-85e8-4e3e-9874-71583bfe166b","Type":"ContainerStarted","Data":"070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a"} Jan 29 14:16:46 crc kubenswrapper[4753]: I0129 14:16:46.462187 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2lwpn" podStartSLOduration=2.633116378 podStartE2EDuration="3.462143956s" podCreationTimestamp="2026-01-29 14:16:43 +0000 UTC" firstStartedPulling="2026-01-29 14:16:44.454701661 +0000 UTC m=+839.149436073" lastFinishedPulling="2026-01-29 14:16:45.283729259 +0000 UTC m=+839.978463651" observedRunningTime="2026-01-29 14:16:46.456294797 +0000 UTC m=+841.151029219" watchObservedRunningTime="2026-01-29 14:16:46.462143956 +0000 UTC m=+841.156878368" Jan 29 14:16:47 crc kubenswrapper[4753]: I0129 14:16:47.358576 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2lwpn"] Jan 29 14:16:47 crc kubenswrapper[4753]: I0129 14:16:47.770137 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-scww8"] Jan 29 14:16:47 crc kubenswrapper[4753]: I0129 14:16:47.771120 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-scww8" Jan 29 14:16:47 crc kubenswrapper[4753]: I0129 14:16:47.785021 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-scww8"] Jan 29 14:16:47 crc kubenswrapper[4753]: I0129 14:16:47.905667 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzp4h\" (UniqueName: \"kubernetes.io/projected/ceb72c42-5011-4360-9b4f-eae057a53ac0-kube-api-access-kzp4h\") pod \"openstack-operator-index-scww8\" (UID: \"ceb72c42-5011-4360-9b4f-eae057a53ac0\") " pod="openstack-operators/openstack-operator-index-scww8" Jan 29 14:16:48 crc kubenswrapper[4753]: I0129 14:16:48.007451 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzp4h\" (UniqueName: \"kubernetes.io/projected/ceb72c42-5011-4360-9b4f-eae057a53ac0-kube-api-access-kzp4h\") pod \"openstack-operator-index-scww8\" (UID: \"ceb72c42-5011-4360-9b4f-eae057a53ac0\") " pod="openstack-operators/openstack-operator-index-scww8" Jan 29 14:16:48 crc kubenswrapper[4753]: I0129 14:16:48.032874 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzp4h\" (UniqueName: \"kubernetes.io/projected/ceb72c42-5011-4360-9b4f-eae057a53ac0-kube-api-access-kzp4h\") pod \"openstack-operator-index-scww8\" (UID: \"ceb72c42-5011-4360-9b4f-eae057a53ac0\") " pod="openstack-operators/openstack-operator-index-scww8" Jan 29 14:16:48 crc kubenswrapper[4753]: I0129 14:16:48.131770 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-scww8" Jan 29 14:16:48 crc kubenswrapper[4753]: I0129 14:16:48.446362 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-2lwpn" podUID="c052ce8f-85e8-4e3e-9874-71583bfe166b" containerName="registry-server" containerID="cri-o://070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a" gracePeriod=2 Jan 29 14:16:48 crc kubenswrapper[4753]: I0129 14:16:48.643410 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-scww8"] Jan 29 14:16:48 crc kubenswrapper[4753]: W0129 14:16:48.653967 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb72c42_5011_4360_9b4f_eae057a53ac0.slice/crio-549d37f5350fa4715e1724331ad4d555060fec6eb08c8aa25155bb733aba093f WatchSource:0}: Error finding container 549d37f5350fa4715e1724331ad4d555060fec6eb08c8aa25155bb733aba093f: Status 404 returned error can't find the container with id 549d37f5350fa4715e1724331ad4d555060fec6eb08c8aa25155bb733aba093f Jan 29 14:16:48 crc kubenswrapper[4753]: I0129 14:16:48.792566 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2lwpn" Jan 29 14:16:48 crc kubenswrapper[4753]: I0129 14:16:48.924521 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw6gt\" (UniqueName: \"kubernetes.io/projected/c052ce8f-85e8-4e3e-9874-71583bfe166b-kube-api-access-hw6gt\") pod \"c052ce8f-85e8-4e3e-9874-71583bfe166b\" (UID: \"c052ce8f-85e8-4e3e-9874-71583bfe166b\") " Jan 29 14:16:48 crc kubenswrapper[4753]: I0129 14:16:48.929259 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c052ce8f-85e8-4e3e-9874-71583bfe166b-kube-api-access-hw6gt" (OuterVolumeSpecName: "kube-api-access-hw6gt") pod "c052ce8f-85e8-4e3e-9874-71583bfe166b" (UID: "c052ce8f-85e8-4e3e-9874-71583bfe166b"). InnerVolumeSpecName "kube-api-access-hw6gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.026748 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw6gt\" (UniqueName: \"kubernetes.io/projected/c052ce8f-85e8-4e3e-9874-71583bfe166b-kube-api-access-hw6gt\") on node \"crc\" DevicePath \"\"" Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.458323 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-scww8" event={"ID":"ceb72c42-5011-4360-9b4f-eae057a53ac0","Type":"ContainerStarted","Data":"3c2cbcb6f53db5a11649eeb71c2bec87410418949add1e5effd9474a97a21269"} Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.458928 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-scww8" event={"ID":"ceb72c42-5011-4360-9b4f-eae057a53ac0","Type":"ContainerStarted","Data":"549d37f5350fa4715e1724331ad4d555060fec6eb08c8aa25155bb733aba093f"} Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.461932 4753 generic.go:334] "Generic (PLEG): container finished" podID="c052ce8f-85e8-4e3e-9874-71583bfe166b" containerID="070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a" exitCode=0 Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.461994 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2lwpn" event={"ID":"c052ce8f-85e8-4e3e-9874-71583bfe166b","Type":"ContainerDied","Data":"070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a"} Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.462029 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2lwpn" event={"ID":"c052ce8f-85e8-4e3e-9874-71583bfe166b","Type":"ContainerDied","Data":"29718dde3991401ba9e089e26606ce7e1eb2304b1376fd1b5f50ce6195dc1fd2"} Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.462062 4753 scope.go:117] "RemoveContainer" containerID="070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a" Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.462095 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2lwpn" Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.486008 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-scww8" podStartSLOduration=2.009795194 podStartE2EDuration="2.485980948s" podCreationTimestamp="2026-01-29 14:16:47 +0000 UTC" firstStartedPulling="2026-01-29 14:16:48.662105171 +0000 UTC m=+843.356839563" lastFinishedPulling="2026-01-29 14:16:49.138290895 +0000 UTC m=+843.833025317" observedRunningTime="2026-01-29 14:16:49.479707539 +0000 UTC m=+844.174441971" watchObservedRunningTime="2026-01-29 14:16:49.485980948 +0000 UTC m=+844.180715370" Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.492298 4753 scope.go:117] "RemoveContainer" containerID="070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a" Jan 29 14:16:49 crc kubenswrapper[4753]: E0129 14:16:49.492815 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a\": container with ID starting with 070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a not found: ID does not exist" containerID="070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a" Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.492953 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a"} err="failed to get container status \"070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a\": rpc error: code = NotFound desc = could not find container \"070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a\": container with ID starting with 070a2b6182acc82c2fe80485065847f6c1a194ce9f1081a9f69c4b7e41076d9a not found: ID does not exist" Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.508757 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2lwpn"] Jan 29 14:16:49 crc kubenswrapper[4753]: I0129 14:16:49.513354 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-2lwpn"] Jan 29 14:16:50 crc kubenswrapper[4753]: I0129 14:16:50.460460 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c052ce8f-85e8-4e3e-9874-71583bfe166b" path="/var/lib/kubelet/pods/c052ce8f-85e8-4e3e-9874-71583bfe166b/volumes" Jan 29 14:16:58 crc kubenswrapper[4753]: I0129 14:16:58.132326 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-scww8" Jan 29 14:16:58 crc kubenswrapper[4753]: I0129 14:16:58.132979 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-scww8" Jan 29 14:16:58 crc kubenswrapper[4753]: I0129 14:16:58.164782 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-scww8" Jan 29 14:16:58 crc kubenswrapper[4753]: I0129 14:16:58.582869 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-scww8" Jan 29 14:16:59 crc kubenswrapper[4753]: I0129 14:16:59.803831 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg"] Jan 29 14:16:59 crc 
kubenswrapper[4753]: E0129 14:16:59.804083 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c052ce8f-85e8-4e3e-9874-71583bfe166b" containerName="registry-server" Jan 29 14:16:59 crc kubenswrapper[4753]: I0129 14:16:59.804098 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c052ce8f-85e8-4e3e-9874-71583bfe166b" containerName="registry-server" Jan 29 14:16:59 crc kubenswrapper[4753]: I0129 14:16:59.804256 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c052ce8f-85e8-4e3e-9874-71583bfe166b" containerName="registry-server" Jan 29 14:16:59 crc kubenswrapper[4753]: I0129 14:16:59.805275 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:16:59 crc kubenswrapper[4753]: I0129 14:16:59.808052 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-pkzb6" Jan 29 14:16:59 crc kubenswrapper[4753]: I0129 14:16:59.823091 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg"] Jan 29 14:16:59 crc kubenswrapper[4753]: I0129 14:16:59.918552 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:16:59 crc kubenswrapper[4753]: I0129 14:16:59.919023 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:16:59 crc kubenswrapper[4753]: I0129 14:16:59.919126 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f264\" (UniqueName: \"kubernetes.io/projected/9ce44e70-fc27-417b-a381-1e253c42a007-kube-api-access-4f264\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.020308 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.020419 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f264\" (UniqueName: \"kubernetes.io/projected/9ce44e70-fc27-417b-a381-1e253c42a007-kube-api-access-4f264\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " 
pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.021106 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.021193 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.021520 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.050534 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f264\" (UniqueName: \"kubernetes.io/projected/9ce44e70-fc27-417b-a381-1e253c42a007-kube-api-access-4f264\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.171528 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.570775 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z4776"] Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.572660 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.609084 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4776"] Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.671200 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg"] Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.728635 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psghr\" (UniqueName: \"kubernetes.io/projected/675342d8-29a2-4936-9eed-343eab94a58c-kube-api-access-psghr\") pod \"certified-operators-z4776\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.728723 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-utilities\") pod \"certified-operators-z4776\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.728746 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-catalog-content\") pod \"certified-operators-z4776\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.830522 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psghr\" (UniqueName: \"kubernetes.io/projected/675342d8-29a2-4936-9eed-343eab94a58c-kube-api-access-psghr\") pod \"certified-operators-z4776\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.831004 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-utilities\") pod \"certified-operators-z4776\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.831026 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-catalog-content\") pod \"certified-operators-z4776\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.831608 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-catalog-content\") pod \"certified-operators-z4776\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.831784 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-utilities\") pod 
\"certified-operators-z4776\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.852736 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psghr\" (UniqueName: \"kubernetes.io/projected/675342d8-29a2-4936-9eed-343eab94a58c-kube-api-access-psghr\") pod \"certified-operators-z4776\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:00 crc kubenswrapper[4753]: I0129 14:17:00.919515 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:01 crc kubenswrapper[4753]: I0129 14:17:01.383793 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4776"] Jan 29 14:17:01 crc kubenswrapper[4753]: I0129 14:17:01.552719 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4776" event={"ID":"675342d8-29a2-4936-9eed-343eab94a58c","Type":"ContainerStarted","Data":"f100ad9d64299e6de17e2fc6b682a98f86261934e70e0f57df6ce0c463e6a1f2"} Jan 29 14:17:01 crc kubenswrapper[4753]: I0129 14:17:01.554317 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" event={"ID":"9ce44e70-fc27-417b-a381-1e253c42a007","Type":"ContainerStarted","Data":"1f7ee87850a6e27777dba9646631c15c719097fef82022d13c6ff4cfe06ddc65"} Jan 29 14:17:05 crc kubenswrapper[4753]: I0129 14:17:05.586332 4753 generic.go:334] "Generic (PLEG): container finished" podID="675342d8-29a2-4936-9eed-343eab94a58c" containerID="0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b" exitCode=0 Jan 29 14:17:05 crc kubenswrapper[4753]: I0129 14:17:05.586447 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4776" event={"ID":"675342d8-29a2-4936-9eed-343eab94a58c","Type":"ContainerDied","Data":"0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b"} Jan 29 14:17:05 crc kubenswrapper[4753]: I0129 14:17:05.590851 4753 generic.go:334] "Generic (PLEG): container finished" podID="9ce44e70-fc27-417b-a381-1e253c42a007" containerID="2c217fd6b942da728ac2bdb6d39c543f3e9891a17cdafc8c814e4f287e710545" exitCode=0 Jan 29 14:17:05 crc kubenswrapper[4753]: I0129 14:17:05.591083 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" event={"ID":"9ce44e70-fc27-417b-a381-1e253c42a007","Type":"ContainerDied","Data":"2c217fd6b942da728ac2bdb6d39c543f3e9891a17cdafc8c814e4f287e710545"} Jan 29 14:17:06 crc kubenswrapper[4753]: I0129 14:17:06.602203 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4776" event={"ID":"675342d8-29a2-4936-9eed-343eab94a58c","Type":"ContainerStarted","Data":"4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38"} Jan 29 14:17:06 crc kubenswrapper[4753]: I0129 14:17:06.605369 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" event={"ID":"9ce44e70-fc27-417b-a381-1e253c42a007","Type":"ContainerStarted","Data":"dac0941329c8741a164547f667481c154e3eaecb419d50f9a93061f9f69576f5"} Jan 29 14:17:07 crc kubenswrapper[4753]: I0129 14:17:07.617450 4753 generic.go:334] "Generic 
(PLEG): container finished" podID="9ce44e70-fc27-417b-a381-1e253c42a007" containerID="dac0941329c8741a164547f667481c154e3eaecb419d50f9a93061f9f69576f5" exitCode=0 Jan 29 14:17:07 crc kubenswrapper[4753]: I0129 14:17:07.617594 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" event={"ID":"9ce44e70-fc27-417b-a381-1e253c42a007","Type":"ContainerDied","Data":"dac0941329c8741a164547f667481c154e3eaecb419d50f9a93061f9f69576f5"} Jan 29 14:17:07 crc kubenswrapper[4753]: I0129 14:17:07.621108 4753 generic.go:334] "Generic (PLEG): container finished" podID="675342d8-29a2-4936-9eed-343eab94a58c" containerID="4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38" exitCode=0 Jan 29 14:17:07 crc kubenswrapper[4753]: I0129 14:17:07.621203 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4776" event={"ID":"675342d8-29a2-4936-9eed-343eab94a58c","Type":"ContainerDied","Data":"4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38"} Jan 29 14:17:08 crc kubenswrapper[4753]: I0129 14:17:08.631113 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4776" event={"ID":"675342d8-29a2-4936-9eed-343eab94a58c","Type":"ContainerStarted","Data":"8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410"} Jan 29 14:17:08 crc kubenswrapper[4753]: I0129 14:17:08.633790 4753 generic.go:334] "Generic (PLEG): container finished" podID="9ce44e70-fc27-417b-a381-1e253c42a007" containerID="edc79de20d01e85e5d80f76f0c4153a4c08a71bed7d3c2b56986de4e210013f3" exitCode=0 Jan 29 14:17:08 crc kubenswrapper[4753]: I0129 14:17:08.633848 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" event={"ID":"9ce44e70-fc27-417b-a381-1e253c42a007","Type":"ContainerDied","Data":"edc79de20d01e85e5d80f76f0c4153a4c08a71bed7d3c2b56986de4e210013f3"} Jan 29 14:17:08 crc kubenswrapper[4753]: I0129 14:17:08.655244 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z4776" podStartSLOduration=6.091539312 podStartE2EDuration="8.655221112s" podCreationTimestamp="2026-01-29 14:17:00 +0000 UTC" firstStartedPulling="2026-01-29 14:17:05.589220601 +0000 UTC m=+860.283955013" lastFinishedPulling="2026-01-29 14:17:08.152902421 +0000 UTC m=+862.847636813" observedRunningTime="2026-01-29 14:17:08.654573385 +0000 UTC m=+863.349307797" watchObservedRunningTime="2026-01-29 14:17:08.655221112 +0000 UTC m=+863.349955524" Jan 29 14:17:09 crc kubenswrapper[4753]: I0129 14:17:09.941670 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.007274 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-bundle\") pod \"9ce44e70-fc27-417b-a381-1e253c42a007\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.007344 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-util\") pod \"9ce44e70-fc27-417b-a381-1e253c42a007\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.008212 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-bundle" (OuterVolumeSpecName: "bundle") pod "9ce44e70-fc27-417b-a381-1e253c42a007" (UID: "9ce44e70-fc27-417b-a381-1e253c42a007"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.009441 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f264\" (UniqueName: \"kubernetes.io/projected/9ce44e70-fc27-417b-a381-1e253c42a007-kube-api-access-4f264\") pod \"9ce44e70-fc27-417b-a381-1e253c42a007\" (UID: \"9ce44e70-fc27-417b-a381-1e253c42a007\") " Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.010131 4753 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.016965 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce44e70-fc27-417b-a381-1e253c42a007-kube-api-access-4f264" (OuterVolumeSpecName: "kube-api-access-4f264") pod "9ce44e70-fc27-417b-a381-1e253c42a007" (UID: "9ce44e70-fc27-417b-a381-1e253c42a007"). InnerVolumeSpecName "kube-api-access-4f264". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.025264 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-util" (OuterVolumeSpecName: "util") pod "9ce44e70-fc27-417b-a381-1e253c42a007" (UID: "9ce44e70-fc27-417b-a381-1e253c42a007"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.111234 4753 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce44e70-fc27-417b-a381-1e253c42a007-util\") on node \"crc\" DevicePath \"\"" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.111277 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f264\" (UniqueName: \"kubernetes.io/projected/9ce44e70-fc27-417b-a381-1e253c42a007-kube-api-access-4f264\") on node \"crc\" DevicePath \"\"" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.656310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" event={"ID":"9ce44e70-fc27-417b-a381-1e253c42a007","Type":"ContainerDied","Data":"1f7ee87850a6e27777dba9646631c15c719097fef82022d13c6ff4cfe06ddc65"} Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.656368 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f7ee87850a6e27777dba9646631c15c719097fef82022d13c6ff4cfe06ddc65" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.656423 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.920023 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:10 crc kubenswrapper[4753]: I0129 14:17:10.920929 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:11 crc kubenswrapper[4753]: I0129 14:17:11.021089 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.042840 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc"] Jan 29 14:17:17 crc kubenswrapper[4753]: E0129 14:17:17.043700 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce44e70-fc27-417b-a381-1e253c42a007" containerName="extract" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.043715 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce44e70-fc27-417b-a381-1e253c42a007" containerName="extract" Jan 29 14:17:17 crc kubenswrapper[4753]: E0129 14:17:17.043742 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce44e70-fc27-417b-a381-1e253c42a007" containerName="util" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.043750 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce44e70-fc27-417b-a381-1e253c42a007" containerName="util" Jan 29 14:17:17 crc kubenswrapper[4753]: E0129 14:17:17.043765 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce44e70-fc27-417b-a381-1e253c42a007" containerName="pull" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.043773 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce44e70-fc27-417b-a381-1e253c42a007" containerName="pull" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.043904 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce44e70-fc27-417b-a381-1e253c42a007" containerName="extract" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.044419 4753 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.048796 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-7fkjc" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.076702 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc"] Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.119785 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9zjl\" (UniqueName: \"kubernetes.io/projected/e80d82c2-34f9-4d52-84c8-880f1c787a27-kube-api-access-f9zjl\") pod \"openstack-operator-controller-init-757f46c65d-zw2hc\" (UID: \"e80d82c2-34f9-4d52-84c8-880f1c787a27\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.221756 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9zjl\" (UniqueName: \"kubernetes.io/projected/e80d82c2-34f9-4d52-84c8-880f1c787a27-kube-api-access-f9zjl\") pod \"openstack-operator-controller-init-757f46c65d-zw2hc\" (UID: \"e80d82c2-34f9-4d52-84c8-880f1c787a27\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.243463 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9zjl\" (UniqueName: \"kubernetes.io/projected/e80d82c2-34f9-4d52-84c8-880f1c787a27-kube-api-access-f9zjl\") pod \"openstack-operator-controller-init-757f46c65d-zw2hc\" (UID: \"e80d82c2-34f9-4d52-84c8-880f1c787a27\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.361745 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.590589 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc"] Jan 29 14:17:17 crc kubenswrapper[4753]: I0129 14:17:17.744238 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" event={"ID":"e80d82c2-34f9-4d52-84c8-880f1c787a27","Type":"ContainerStarted","Data":"4b1adddbbd30455a3dab8c3d8b5a737215c3a34c0c907f993e58968222fd5987"} Jan 29 14:17:20 crc kubenswrapper[4753]: I0129 14:17:20.971523 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:22 crc kubenswrapper[4753]: I0129 14:17:22.681432 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z4776"] Jan 29 14:17:22 crc kubenswrapper[4753]: I0129 14:17:22.682262 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z4776" podUID="675342d8-29a2-4936-9eed-343eab94a58c" containerName="registry-server" containerID="cri-o://8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410" gracePeriod=2 Jan 29 14:17:22 crc kubenswrapper[4753]: I0129 14:17:22.779671 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" event={"ID":"e80d82c2-34f9-4d52-84c8-880f1c787a27","Type":"ContainerStarted","Data":"ea3e043d2627848df0f27c95d66278fb86e7569969d636a164711c66cd43527e"} Jan 29 14:17:22 crc kubenswrapper[4753]: I0129 14:17:22.779938 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" Jan 29 14:17:22 crc kubenswrapper[4753]: I0129 14:17:22.827236 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" podStartSLOduration=1.107880869 podStartE2EDuration="5.827215009s" podCreationTimestamp="2026-01-29 14:17:17 +0000 UTC" firstStartedPulling="2026-01-29 14:17:17.611631592 +0000 UTC m=+872.306365974" lastFinishedPulling="2026-01-29 14:17:22.330965692 +0000 UTC m=+877.025700114" observedRunningTime="2026-01-29 14:17:22.826124339 +0000 UTC m=+877.520858771" watchObservedRunningTime="2026-01-29 14:17:22.827215009 +0000 UTC m=+877.521949401" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.168714 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.308113 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-utilities\") pod \"675342d8-29a2-4936-9eed-343eab94a58c\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.308340 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-catalog-content\") pod \"675342d8-29a2-4936-9eed-343eab94a58c\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.308424 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psghr\" (UniqueName: \"kubernetes.io/projected/675342d8-29a2-4936-9eed-343eab94a58c-kube-api-access-psghr\") pod \"675342d8-29a2-4936-9eed-343eab94a58c\" (UID: \"675342d8-29a2-4936-9eed-343eab94a58c\") " Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.308923 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-utilities" (OuterVolumeSpecName: "utilities") pod "675342d8-29a2-4936-9eed-343eab94a58c" (UID: "675342d8-29a2-4936-9eed-343eab94a58c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.309070 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.317252 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/675342d8-29a2-4936-9eed-343eab94a58c-kube-api-access-psghr" (OuterVolumeSpecName: "kube-api-access-psghr") pod "675342d8-29a2-4936-9eed-343eab94a58c" (UID: "675342d8-29a2-4936-9eed-343eab94a58c"). InnerVolumeSpecName "kube-api-access-psghr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.370640 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "675342d8-29a2-4936-9eed-343eab94a58c" (UID: "675342d8-29a2-4936-9eed-343eab94a58c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.410271 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/675342d8-29a2-4936-9eed-343eab94a58c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.410314 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psghr\" (UniqueName: \"kubernetes.io/projected/675342d8-29a2-4936-9eed-343eab94a58c-kube-api-access-psghr\") on node \"crc\" DevicePath \"\"" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.795226 4753 generic.go:334] "Generic (PLEG): container finished" podID="675342d8-29a2-4936-9eed-343eab94a58c" containerID="8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410" exitCode=0 Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.795306 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4776" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.795385 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4776" event={"ID":"675342d8-29a2-4936-9eed-343eab94a58c","Type":"ContainerDied","Data":"8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410"} Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.795425 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4776" event={"ID":"675342d8-29a2-4936-9eed-343eab94a58c","Type":"ContainerDied","Data":"f100ad9d64299e6de17e2fc6b682a98f86261934e70e0f57df6ce0c463e6a1f2"} Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.795454 4753 scope.go:117] "RemoveContainer" containerID="8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.821561 4753 scope.go:117] "RemoveContainer" containerID="4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.848501 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z4776"] Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.855440 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z4776"] Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.860709 4753 scope.go:117] "RemoveContainer" containerID="0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.882452 4753 scope.go:117] "RemoveContainer" containerID="8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410" Jan 29 14:17:23 crc kubenswrapper[4753]: E0129 14:17:23.883686 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410\": container with ID starting with 8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410 not found: ID does not exist" containerID="8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.883790 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410"} err="failed to get container status 
\"8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410\": rpc error: code = NotFound desc = could not find container \"8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410\": container with ID starting with 8febfbe76a0c8557ed60c5f1a9fdae7cf8318449e03e48752ede728b64c6b410 not found: ID does not exist" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.883836 4753 scope.go:117] "RemoveContainer" containerID="4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38" Jan 29 14:17:23 crc kubenswrapper[4753]: E0129 14:17:23.884378 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38\": container with ID starting with 4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38 not found: ID does not exist" containerID="4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.884449 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38"} err="failed to get container status \"4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38\": rpc error: code = NotFound desc = could not find container \"4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38\": container with ID starting with 4acb135d91be756e6d4ae0f95fb034b605be9645babe7389211580d3bf7b0b38 not found: ID does not exist" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.884479 4753 scope.go:117] "RemoveContainer" containerID="0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b" Jan 29 14:17:23 crc kubenswrapper[4753]: E0129 14:17:23.884977 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b\": container with ID starting with 0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b not found: ID does not exist" containerID="0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b" Jan 29 14:17:23 crc kubenswrapper[4753]: I0129 14:17:23.885031 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b"} err="failed to get container status \"0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b\": rpc error: code = NotFound desc = could not find container \"0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b\": container with ID starting with 0085e231214637a29935e8e58bbfad31b55966c5e21a0f3de06715f6d08c593b not found: ID does not exist" Jan 29 14:17:24 crc kubenswrapper[4753]: I0129 14:17:24.164129 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="675342d8-29a2-4936-9eed-343eab94a58c" path="/var/lib/kubelet/pods/675342d8-29a2-4936-9eed-343eab94a58c/volumes" Jan 29 14:17:27 crc kubenswrapper[4753]: I0129 14:17:27.365141 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-zw2hc" Jan 29 14:17:57 crc kubenswrapper[4753]: I0129 14:17:57.055188 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:17:57 crc kubenswrapper[4753]: I0129 14:17:57.056560 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.049806 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j"] Jan 29 14:18:04 crc kubenswrapper[4753]: E0129 14:18:04.050979 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675342d8-29a2-4936-9eed-343eab94a58c" containerName="extract-utilities" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.051004 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="675342d8-29a2-4936-9eed-343eab94a58c" containerName="extract-utilities" Jan 29 14:18:04 crc kubenswrapper[4753]: E0129 14:18:04.051027 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675342d8-29a2-4936-9eed-343eab94a58c" containerName="registry-server" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.051040 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="675342d8-29a2-4936-9eed-343eab94a58c" containerName="registry-server" Jan 29 14:18:04 crc kubenswrapper[4753]: E0129 14:18:04.051066 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675342d8-29a2-4936-9eed-343eab94a58c" containerName="extract-content" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.051078 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="675342d8-29a2-4936-9eed-343eab94a58c" containerName="extract-content" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.051348 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="675342d8-29a2-4936-9eed-343eab94a58c" containerName="registry-server" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.052033 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.053974 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.054634 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-gz6mw" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.054765 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.057303 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-r7ss6" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.069694 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.075718 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.079980 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.080875 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.120719 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-jsxwn" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.130469 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.211319 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.214585 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.223899 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-g4v82" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.225553 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.226380 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.231121 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-27n2c" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.232209 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.232934 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.260049 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-8n87r" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.262786 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.276787 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9tz4\" (UniqueName: \"kubernetes.io/projected/8ef2b554-7857-404c-adce-f82ebcf71f72-kube-api-access-z9tz4\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-28p9j\" (UID: \"8ef2b554-7857-404c-adce-f82ebcf71f72\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.278226 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6wph\" (UniqueName: \"kubernetes.io/projected/9d8da066-fe2d-4cf7-b721-3155f8f11510-kube-api-access-g6wph\") pod \"cinder-operator-controller-manager-8d874c8fc-h5nxt\" (UID: \"9d8da066-fe2d-4cf7-b721-3155f8f11510\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.278349 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcnvn\" (UniqueName: \"kubernetes.io/projected/ea463c74-766e-424b-a930-cc8cad45ea88-kube-api-access-rcnvn\") pod \"heat-operator-controller-manager-69d6db494d-57kmq\" (UID: \"ea463c74-766e-424b-a930-cc8cad45ea88\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.278487 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvrwh\" (UniqueName: \"kubernetes.io/projected/81d6f0fa-ae4f-46e3-8103-bc97b1afc209-kube-api-access-xvrwh\") pod \"horizon-operator-controller-manager-5fb775575f-2mvrq\" (UID: \"81d6f0fa-ae4f-46e3-8103-bc97b1afc209\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.278586 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs68g\" (UniqueName: \"kubernetes.io/projected/f20cfb79-500e-4652-b32d-098d8b27e031-kube-api-access-zs68g\") pod \"glance-operator-controller-manager-8886f4c47-n6kx7\" (UID: \"f20cfb79-500e-4652-b32d-098d8b27e031\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.278871 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c66x8\" (UniqueName: \"kubernetes.io/projected/3482315b-e8cc-4dcf-9eb6-fb120739e361-kube-api-access-c66x8\") pod \"designate-operator-controller-manager-6d9697b7f4-vc8jd\" (UID: \"3482315b-e8cc-4dcf-9eb6-fb120739e361\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.301217 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.337497 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.341562 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.351900 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-cjjl2" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.352324 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.382191 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c66x8\" (UniqueName: \"kubernetes.io/projected/3482315b-e8cc-4dcf-9eb6-fb120739e361-kube-api-access-c66x8\") pod \"designate-operator-controller-manager-6d9697b7f4-vc8jd\" (UID: \"3482315b-e8cc-4dcf-9eb6-fb120739e361\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.382704 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9tz4\" (UniqueName: \"kubernetes.io/projected/8ef2b554-7857-404c-adce-f82ebcf71f72-kube-api-access-z9tz4\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-28p9j\" (UID: \"8ef2b554-7857-404c-adce-f82ebcf71f72\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.382772 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.382823 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6wph\" (UniqueName: \"kubernetes.io/projected/9d8da066-fe2d-4cf7-b721-3155f8f11510-kube-api-access-g6wph\") pod \"cinder-operator-controller-manager-8d874c8fc-h5nxt\" (UID: \"9d8da066-fe2d-4cf7-b721-3155f8f11510\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.382981 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcnvn\" (UniqueName: \"kubernetes.io/projected/ea463c74-766e-424b-a930-cc8cad45ea88-kube-api-access-rcnvn\") pod \"heat-operator-controller-manager-69d6db494d-57kmq\" (UID: \"ea463c74-766e-424b-a930-cc8cad45ea88\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.383235 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvrwh\" (UniqueName: \"kubernetes.io/projected/81d6f0fa-ae4f-46e3-8103-bc97b1afc209-kube-api-access-xvrwh\") pod \"horizon-operator-controller-manager-5fb775575f-2mvrq\" (UID: \"81d6f0fa-ae4f-46e3-8103-bc97b1afc209\") " 
pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.383399 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs68g\" (UniqueName: \"kubernetes.io/projected/f20cfb79-500e-4652-b32d-098d8b27e031-kube-api-access-zs68g\") pod \"glance-operator-controller-manager-8886f4c47-n6kx7\" (UID: \"f20cfb79-500e-4652-b32d-098d8b27e031\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.383549 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk9mf\" (UniqueName: \"kubernetes.io/projected/6bfa698b-c528-4171-88ee-3480e2715dc9-kube-api-access-xk9mf\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.390362 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.413499 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.414495 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.432723 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-5dt6b" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.432948 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.437023 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcnvn\" (UniqueName: \"kubernetes.io/projected/ea463c74-766e-424b-a930-cc8cad45ea88-kube-api-access-rcnvn\") pod \"heat-operator-controller-manager-69d6db494d-57kmq\" (UID: \"ea463c74-766e-424b-a930-cc8cad45ea88\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.448527 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.450135 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.459136 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9tz4\" (UniqueName: \"kubernetes.io/projected/8ef2b554-7857-404c-adce-f82ebcf71f72-kube-api-access-z9tz4\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-28p9j\" (UID: \"8ef2b554-7857-404c-adce-f82ebcf71f72\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.476433 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6wph\" (UniqueName: \"kubernetes.io/projected/9d8da066-fe2d-4cf7-b721-3155f8f11510-kube-api-access-g6wph\") pod \"cinder-operator-controller-manager-8d874c8fc-h5nxt\" (UID: \"9d8da066-fe2d-4cf7-b721-3155f8f11510\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.476784 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvrwh\" (UniqueName: \"kubernetes.io/projected/81d6f0fa-ae4f-46e3-8103-bc97b1afc209-kube-api-access-xvrwh\") pod \"horizon-operator-controller-manager-5fb775575f-2mvrq\" (UID: \"81d6f0fa-ae4f-46e3-8103-bc97b1afc209\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.478532 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c66x8\" (UniqueName: \"kubernetes.io/projected/3482315b-e8cc-4dcf-9eb6-fb120739e361-kube-api-access-c66x8\") pod \"designate-operator-controller-manager-6d9697b7f4-vc8jd\" (UID: \"3482315b-e8cc-4dcf-9eb6-fb120739e361\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.480399 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.490904 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npgv5\" (UniqueName: \"kubernetes.io/projected/91f93520-085a-4d27-bffa-f8bc1956c686-kube-api-access-npgv5\") pod \"ironic-operator-controller-manager-5f4b8bd54d-jnsmr\" (UID: \"91f93520-085a-4d27-bffa-f8bc1956c686\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.491214 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk9mf\" (UniqueName: \"kubernetes.io/projected/6bfa698b-c528-4171-88ee-3480e2715dc9-kube-api-access-xk9mf\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.491321 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.491576 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99h4f\" (UniqueName: \"kubernetes.io/projected/88bc469c-847c-4e52-9612-cfd238cbcf3d-kube-api-access-99h4f\") pod \"keystone-operator-controller-manager-84f48565d4-tjf4x\" (UID: \"88bc469c-847c-4e52-9612-cfd238cbcf3d\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.495348 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.496529 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.499337 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" Jan 29 14:18:04 crc kubenswrapper[4753]: E0129 14:18:04.499717 4753 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:04 crc kubenswrapper[4753]: E0129 14:18:04.499792 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert podName:6bfa698b-c528-4171-88ee-3480e2715dc9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:04.999770882 +0000 UTC m=+919.694505264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert") pod "infra-operator-controller-manager-79955696d6-mgwrk" (UID: "6bfa698b-c528-4171-88ee-3480e2715dc9") : secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.502900 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs68g\" (UniqueName: \"kubernetes.io/projected/f20cfb79-500e-4652-b32d-098d8b27e031-kube-api-access-zs68g\") pod \"glance-operator-controller-manager-8886f4c47-n6kx7\" (UID: \"f20cfb79-500e-4652-b32d-098d8b27e031\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.511144 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.513599 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lrnk7" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.513835 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-5kj2m" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.535450 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.552896 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.553938 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.557627 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-5dn5d" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.558237 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.559012 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk9mf\" (UniqueName: \"kubernetes.io/projected/6bfa698b-c528-4171-88ee-3480e2715dc9-kube-api-access-xk9mf\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.563387 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.564302 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.570980 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5jpgz" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.592720 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npgv5\" (UniqueName: \"kubernetes.io/projected/91f93520-085a-4d27-bffa-f8bc1956c686-kube-api-access-npgv5\") pod \"ironic-operator-controller-manager-5f4b8bd54d-jnsmr\" (UID: \"91f93520-085a-4d27-bffa-f8bc1956c686\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.592838 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99h4f\" (UniqueName: \"kubernetes.io/projected/88bc469c-847c-4e52-9612-cfd238cbcf3d-kube-api-access-99h4f\") pod \"keystone-operator-controller-manager-84f48565d4-tjf4x\" (UID: \"88bc469c-847c-4e52-9612-cfd238cbcf3d\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.610410 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.627842 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99h4f\" (UniqueName: \"kubernetes.io/projected/88bc469c-847c-4e52-9612-cfd238cbcf3d-kube-api-access-99h4f\") pod \"keystone-operator-controller-manager-84f48565d4-tjf4x\" (UID: \"88bc469c-847c-4e52-9612-cfd238cbcf3d\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.628761 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.639446 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.648782 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npgv5\" (UniqueName: \"kubernetes.io/projected/91f93520-085a-4d27-bffa-f8bc1956c686-kube-api-access-npgv5\") pod \"ironic-operator-controller-manager-5f4b8bd54d-jnsmr\" (UID: \"91f93520-085a-4d27-bffa-f8bc1956c686\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.648844 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.668461 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.682457 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.694596 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2dst\" (UniqueName: \"kubernetes.io/projected/6d74ea48-8122-4dae-9adf-472a4d2ce3c9-kube-api-access-g2dst\") pod \"neutron-operator-controller-manager-585dbc889-hr6vm\" (UID: \"6d74ea48-8122-4dae-9adf-472a4d2ce3c9\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.694689 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kgck\" (UniqueName: \"kubernetes.io/projected/ed5e20f7-cf91-4238-9472-eba0bcc3183b-kube-api-access-9kgck\") pod \"manila-operator-controller-manager-7dd968899f-n6t9d\" (UID: \"ed5e20f7-cf91-4238-9472-eba0bcc3183b\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.694726 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5d6f\" (UniqueName: \"kubernetes.io/projected/7d8a4a16-258a-4759-980b-98f13fa2e64c-kube-api-access-v5d6f\") pod \"mariadb-operator-controller-manager-67bf948998-9777h\" (UID: \"7d8a4a16-258a-4759-980b-98f13fa2e64c\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.708678 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.709662 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.712986 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.717509 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-qdmhn" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.731025 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.744486 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rsmvf" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.744722 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.747333 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.752986 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.753475 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.756589 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.761469 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-cflgb" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.767525 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2gzns" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.768823 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.776418 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.782547 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.783434 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.798661 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8p7sc" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.803255 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kgck\" (UniqueName: \"kubernetes.io/projected/ed5e20f7-cf91-4238-9472-eba0bcc3183b-kube-api-access-9kgck\") pod \"manila-operator-controller-manager-7dd968899f-n6t9d\" (UID: \"ed5e20f7-cf91-4238-9472-eba0bcc3183b\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.803299 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5d6f\" (UniqueName: \"kubernetes.io/projected/7d8a4a16-258a-4759-980b-98f13fa2e64c-kube-api-access-v5d6f\") pod \"mariadb-operator-controller-manager-67bf948998-9777h\" (UID: \"7d8a4a16-258a-4759-980b-98f13fa2e64c\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.803405 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2dst\" (UniqueName: \"kubernetes.io/projected/6d74ea48-8122-4dae-9adf-472a4d2ce3c9-kube-api-access-g2dst\") pod \"neutron-operator-controller-manager-585dbc889-hr6vm\" (UID: \"6d74ea48-8122-4dae-9adf-472a4d2ce3c9\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.845846 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2dst\" (UniqueName: \"kubernetes.io/projected/6d74ea48-8122-4dae-9adf-472a4d2ce3c9-kube-api-access-g2dst\") pod \"neutron-operator-controller-manager-585dbc889-hr6vm\" (UID: \"6d74ea48-8122-4dae-9adf-472a4d2ce3c9\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.845851 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5d6f\" (UniqueName: \"kubernetes.io/projected/7d8a4a16-258a-4759-980b-98f13fa2e64c-kube-api-access-v5d6f\") pod \"mariadb-operator-controller-manager-67bf948998-9777h\" (UID: \"7d8a4a16-258a-4759-980b-98f13fa2e64c\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.846132 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.875213 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kgck\" (UniqueName: \"kubernetes.io/projected/ed5e20f7-cf91-4238-9472-eba0bcc3183b-kube-api-access-9kgck\") pod \"manila-operator-controller-manager-7dd968899f-n6t9d\" (UID: \"ed5e20f7-cf91-4238-9472-eba0bcc3183b\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.904879 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.904965 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qhsf\" (UniqueName: \"kubernetes.io/projected/20e19d51-387b-4da6-8e39-652b24176ef9-kube-api-access-9qhsf\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.905005 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnqfr\" (UniqueName: \"kubernetes.io/projected/9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb-kube-api-access-cnqfr\") pod \"octavia-operator-controller-manager-6687f8d877-8wdkj\" (UID: \"9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.905070 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sshfl\" (UniqueName: \"kubernetes.io/projected/10c551ac-f50b-4773-8c83-e3e10e76f0c1-kube-api-access-sshfl\") pod \"ovn-operator-controller-manager-788c46999f-c6w2c\" (UID: \"10c551ac-f50b-4773-8c83-e3e10e76f0c1\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.905116 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqtrk\" (UniqueName: \"kubernetes.io/projected/4f2207b2-9101-4661-8ccf-d2eb0c57092a-kube-api-access-sqtrk\") pod \"nova-operator-controller-manager-55bff696bd-lml7t\" (UID: \"4f2207b2-9101-4661-8ccf-d2eb0c57092a\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.905138 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbt5d\" (UniqueName: \"kubernetes.io/projected/a3142007-86c8-4dda-a225-813a250be829-kube-api-access-lbt5d\") pod \"placement-operator-controller-manager-5b964cf4cd-pkk94\" (UID: \"a3142007-86c8-4dda-a225-813a250be829\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.909464 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.932763 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.937579 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.952523 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.958702 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-pnn29" Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.978498 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94"] Jan 29 14:18:04 crc kubenswrapper[4753]: I0129 14:18:04.983955 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.003337 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.006251 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqtrk\" (UniqueName: \"kubernetes.io/projected/4f2207b2-9101-4661-8ccf-d2eb0c57092a-kube-api-access-sqtrk\") pod \"nova-operator-controller-manager-55bff696bd-lml7t\" (UID: \"4f2207b2-9101-4661-8ccf-d2eb0c57092a\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.006287 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbt5d\" (UniqueName: \"kubernetes.io/projected/a3142007-86c8-4dda-a225-813a250be829-kube-api-access-lbt5d\") pod \"placement-operator-controller-manager-5b964cf4cd-pkk94\" (UID: \"a3142007-86c8-4dda-a225-813a250be829\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.006314 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.006357 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qhsf\" (UniqueName: \"kubernetes.io/projected/20e19d51-387b-4da6-8e39-652b24176ef9-kube-api-access-9qhsf\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.006388 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnqfr\" (UniqueName: 
\"kubernetes.io/projected/9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb-kube-api-access-cnqfr\") pod \"octavia-operator-controller-manager-6687f8d877-8wdkj\" (UID: \"9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.006414 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.006445 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sshfl\" (UniqueName: \"kubernetes.io/projected/10c551ac-f50b-4773-8c83-e3e10e76f0c1-kube-api-access-sshfl\") pod \"ovn-operator-controller-manager-788c46999f-c6w2c\" (UID: \"10c551ac-f50b-4773-8c83-e3e10e76f0c1\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c" Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.006992 4753 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.007034 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert podName:20e19d51-387b-4da6-8e39-652b24176ef9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:05.507020106 +0000 UTC m=+920.201754488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" (UID: "20e19d51-387b-4da6-8e39-652b24176ef9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.007297 4753 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.007321 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert podName:6bfa698b-c528-4171-88ee-3480e2715dc9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:06.007313674 +0000 UTC m=+920.702048056 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert") pod "infra-operator-controller-manager-79955696d6-mgwrk" (UID: "6bfa698b-c528-4171-88ee-3480e2715dc9") : secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.019321 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.020167 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.051683 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnqfr\" (UniqueName: \"kubernetes.io/projected/9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb-kube-api-access-cnqfr\") pod \"octavia-operator-controller-manager-6687f8d877-8wdkj\" (UID: \"9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.058796 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sshfl\" (UniqueName: \"kubernetes.io/projected/10c551ac-f50b-4773-8c83-e3e10e76f0c1-kube-api-access-sshfl\") pod \"ovn-operator-controller-manager-788c46999f-c6w2c\" (UID: \"10c551ac-f50b-4773-8c83-e3e10e76f0c1\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.067689 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.068955 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.070339 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.072653 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bqq2d" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.080621 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qhsf\" (UniqueName: \"kubernetes.io/projected/20e19d51-387b-4da6-8e39-652b24176ef9-kube-api-access-9qhsf\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.081900 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqtrk\" (UniqueName: \"kubernetes.io/projected/4f2207b2-9101-4661-8ccf-d2eb0c57092a-kube-api-access-sqtrk\") pod \"nova-operator-controller-manager-55bff696bd-lml7t\" (UID: \"4f2207b2-9101-4661-8ccf-d2eb0c57092a\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.093681 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbt5d\" (UniqueName: \"kubernetes.io/projected/a3142007-86c8-4dda-a225-813a250be829-kube-api-access-lbt5d\") pod \"placement-operator-controller-manager-5b964cf4cd-pkk94\" (UID: \"a3142007-86c8-4dda-a225-813a250be829\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.093917 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-wfszx"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.101316 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.120272 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.121835 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbxfc\" (UniqueName: \"kubernetes.io/projected/fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4-kube-api-access-hbxfc\") pod \"swift-operator-controller-manager-68fc8c869-56nqb\" (UID: \"fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.122760 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-r6fj4" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.123519 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.124369 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.128993 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hjsgk" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.138490 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.150113 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-wfszx"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.154631 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.170626 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.171747 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.179270 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.185473 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.185668 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.185791 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2gsj6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.217864 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.218895 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.219474 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.223396 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ww2lp" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.223798 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbxfc\" (UniqueName: \"kubernetes.io/projected/fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4-kube-api-access-hbxfc\") pod \"swift-operator-controller-manager-68fc8c869-56nqb\" (UID: \"fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.223833 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.223853 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.223874 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kz9g\" (UniqueName: \"kubernetes.io/projected/ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0-kube-api-access-5kz9g\") pod \"test-operator-controller-manager-56f8bfcd9f-57nnj\" (UID: \"ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0\") " 
pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.223891 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmp9c\" (UniqueName: \"kubernetes.io/projected/d7e2152d-0998-475e-b645-23df5698e858-kube-api-access-fmp9c\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.223934 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69r9\" (UniqueName: \"kubernetes.io/projected/53a3af2e-de03-483f-ba36-253eb5e9db1d-kube-api-access-z69r9\") pod \"watcher-operator-controller-manager-564965969-wfszx\" (UID: \"53a3af2e-de03-483f-ba36-253eb5e9db1d\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.223952 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8jw4\" (UniqueName: \"kubernetes.io/projected/26b13b81-bb4c-4b22-88a3-975875eb76dc-kube-api-access-x8jw4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pkpp6\" (UID: \"26b13b81-bb4c-4b22-88a3-975875eb76dc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.223970 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc89j\" (UniqueName: \"kubernetes.io/projected/460c2c4f-24cd-4817-8145-20641c54a23e-kube-api-access-lc89j\") pod \"telemetry-operator-controller-manager-64b5b76f97-5nnl6\" (UID: \"460c2c4f-24cd-4817-8145-20641c54a23e\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.232582 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.254313 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xvnht"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.255018 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.256712 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.258006 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbxfc\" (UniqueName: \"kubernetes.io/projected/fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4-kube-api-access-hbxfc\") pod \"swift-operator-controller-manager-68fc8c869-56nqb\" (UID: \"fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.267443 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvnht"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.274898 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.302459 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.324811 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.324844 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.324867 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kz9g\" (UniqueName: \"kubernetes.io/projected/ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0-kube-api-access-5kz9g\") pod \"test-operator-controller-manager-56f8bfcd9f-57nnj\" (UID: \"ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.324886 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmp9c\" (UniqueName: \"kubernetes.io/projected/d7e2152d-0998-475e-b645-23df5698e858-kube-api-access-fmp9c\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.324937 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z69r9\" (UniqueName: \"kubernetes.io/projected/53a3af2e-de03-483f-ba36-253eb5e9db1d-kube-api-access-z69r9\") pod \"watcher-operator-controller-manager-564965969-wfszx\" (UID: \"53a3af2e-de03-483f-ba36-253eb5e9db1d\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.324952 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8jw4\" (UniqueName: \"kubernetes.io/projected/26b13b81-bb4c-4b22-88a3-975875eb76dc-kube-api-access-x8jw4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pkpp6\" (UID: \"26b13b81-bb4c-4b22-88a3-975875eb76dc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.324969 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc89j\" (UniqueName: \"kubernetes.io/projected/460c2c4f-24cd-4817-8145-20641c54a23e-kube-api-access-lc89j\") pod \"telemetry-operator-controller-manager-64b5b76f97-5nnl6\" (UID: \"460c2c4f-24cd-4817-8145-20641c54a23e\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" Jan 29 14:18:05 crc kubenswrapper[4753]: 
E0129 14:18:05.325400 4753 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.325443 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:05.825428188 +0000 UTC m=+920.520162570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "metrics-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.332942 4753 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.333001 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:05.832986762 +0000 UTC m=+920.527721144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.362673 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc89j\" (UniqueName: \"kubernetes.io/projected/460c2c4f-24cd-4817-8145-20641c54a23e-kube-api-access-lc89j\") pod \"telemetry-operator-controller-manager-64b5b76f97-5nnl6\" (UID: \"460c2c4f-24cd-4817-8145-20641c54a23e\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.365589 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8jw4\" (UniqueName: \"kubernetes.io/projected/26b13b81-bb4c-4b22-88a3-975875eb76dc-kube-api-access-x8jw4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pkpp6\" (UID: \"26b13b81-bb4c-4b22-88a3-975875eb76dc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.366258 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmp9c\" (UniqueName: \"kubernetes.io/projected/d7e2152d-0998-475e-b645-23df5698e858-kube-api-access-fmp9c\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.369631 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kz9g\" (UniqueName: \"kubernetes.io/projected/ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0-kube-api-access-5kz9g\") pod \"test-operator-controller-manager-56f8bfcd9f-57nnj\" (UID: \"ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.370575 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69r9\" (UniqueName: \"kubernetes.io/projected/53a3af2e-de03-483f-ba36-253eb5e9db1d-kube-api-access-z69r9\") pod \"watcher-operator-controller-manager-564965969-wfszx\" (UID: \"53a3af2e-de03-483f-ba36-253eb5e9db1d\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.430167 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-utilities\") pod \"community-operators-xvnht\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.430234 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2r6h\" (UniqueName: \"kubernetes.io/projected/151add79-d51f-439d-a014-d0a42aa992f1-kube-api-access-k2r6h\") pod \"community-operators-xvnht\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.430283 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-catalog-content\") pod \"community-operators-xvnht\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.489876 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.512576 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.532845 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-utilities\") pod \"community-operators-xvnht\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.532904 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2r6h\" (UniqueName: \"kubernetes.io/projected/151add79-d51f-439d-a014-d0a42aa992f1-kube-api-access-k2r6h\") pod \"community-operators-xvnht\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.532942 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.532966 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-catalog-content\") pod \"community-operators-xvnht\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.533549 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-catalog-content\") pod \"community-operators-xvnht\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.533764 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-utilities\") pod \"community-operators-xvnht\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.534075 4753 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.534120 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert podName:20e19d51-387b-4da6-8e39-652b24176ef9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:06.534106706 +0000 UTC m=+921.228841088 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" (UID: "20e19d51-387b-4da6-8e39-652b24176ef9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.546751 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.564732 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.567517 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2r6h\" (UniqueName: \"kubernetes.io/projected/151add79-d51f-439d-a014-d0a42aa992f1-kube-api-access-k2r6h\") pod \"community-operators-xvnht\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.573240 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.590769 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.625588 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7"] Jan 29 14:18:05 crc kubenswrapper[4753]: W0129 14:18:05.637800 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d8da066_fe2d_4cf7_b721_3155f8f11510.slice/crio-1e2872373ff8e5c60aa1e44453f0d03e69cce44f2b1a8b59e775e117e65825af WatchSource:0}: Error finding container 1e2872373ff8e5c60aa1e44453f0d03e69cce44f2b1a8b59e775e117e65825af: Status 404 returned error can't find the container with id 1e2872373ff8e5c60aa1e44453f0d03e69cce44f2b1a8b59e775e117e65825af Jan 29 14:18:05 crc kubenswrapper[4753]: W0129 14:18:05.700646 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf20cfb79_500e_4652_b32d_098d8b27e031.slice/crio-a53887d77d561f57f0c4a666a7bfb79045f9e17fff5e36ff0d6a5d7cef693fcb WatchSource:0}: Error finding container a53887d77d561f57f0c4a666a7bfb79045f9e17fff5e36ff0d6a5d7cef693fcb: Status 404 returned error can't find the container with id a53887d77d561f57f0c4a666a7bfb79045f9e17fff5e36ff0d6a5d7cef693fcb Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.704845 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.707679 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x"] Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.840063 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: I0129 14:18:05.840109 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.840270 4753 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.840330 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:06.840315188 +0000 UTC m=+921.535049570 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "webhook-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.840523 4753 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 14:18:05 crc kubenswrapper[4753]: E0129 14:18:05.840640 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:06.840612017 +0000 UTC m=+921.535346399 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "metrics-server-cert" not found Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.009839 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.017322 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.026285 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.032806 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.042321 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.042547 4753 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.042599 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert podName:6bfa698b-c528-4171-88ee-3480e2715dc9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:08.042585783 +0000 UTC m=+922.737320165 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert") pod "infra-operator-controller-manager-79955696d6-mgwrk" (UID: "6bfa698b-c528-4171-88ee-3480e2715dc9") : secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.249642 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.260221 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x" event={"ID":"88bc469c-847c-4e52-9612-cfd238cbcf3d","Type":"ContainerStarted","Data":"902aa88521c38f17c61dbd0a02ed3b46b1b8138f15d77cf43a48b1bb98c8882d"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.260631 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.261686 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd" event={"ID":"3482315b-e8cc-4dcf-9eb6-fb120739e361","Type":"ContainerStarted","Data":"5497829b2fad16d76d2e62915fef85e6e31c0c247d7724cdc35097d95fa0c3ec"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.263394 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" event={"ID":"7d8a4a16-258a-4759-980b-98f13fa2e64c","Type":"ContainerStarted","Data":"5d5475842d2165d111f41038f1c9ce691ac5145fcd15acc781b991b7a4d3746b"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.265867 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" event={"ID":"9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb","Type":"ContainerStarted","Data":"5ebe0bbe6d3632bdd7aeef36286f280dd2b545854730175f436c42eed396b618"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.267507 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm" event={"ID":"6d74ea48-8122-4dae-9adf-472a4d2ce3c9","Type":"ContainerStarted","Data":"e19fdd01ea29c83c2bb32d2f30a911ddcb9f0481ed703b6454d40f0bab566120"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.268356 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" event={"ID":"91f93520-085a-4d27-bffa-f8bc1956c686","Type":"ContainerStarted","Data":"f205aa7ad0a5f79bc83b7e414c3773f3ecd93e7aa13c9f3f2878312829aff7ca"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.268419 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.269240 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d" event={"ID":"ed5e20f7-cf91-4238-9472-eba0bcc3183b","Type":"ContainerStarted","Data":"ff91d7e6b169a6a844d154478f97e5b84773bca530d613fb5a26b998bfed5d94"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.270138 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j" 
event={"ID":"8ef2b554-7857-404c-adce-f82ebcf71f72","Type":"ContainerStarted","Data":"da684f16d061b6e5f79ab3dfc1506279658ada8daf49f894b718091252840a08"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.271065 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7" event={"ID":"f20cfb79-500e-4652-b32d-098d8b27e031","Type":"ContainerStarted","Data":"a53887d77d561f57f0c4a666a7bfb79045f9e17fff5e36ff0d6a5d7cef693fcb"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.271822 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" event={"ID":"9d8da066-fe2d-4cf7-b721-3155f8f11510","Type":"ContainerStarted","Data":"1e2872373ff8e5c60aa1e44453f0d03e69cce44f2b1a8b59e775e117e65825af"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.272742 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq" event={"ID":"81d6f0fa-ae4f-46e3-8103-bc97b1afc209","Type":"ContainerStarted","Data":"205663c5be18fbcb9ab05e898c7cffbe2ec473eb40a76f7f424e95df8461b6bb"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.274106 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.275433 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq" event={"ID":"ea463c74-766e-424b-a930-cc8cad45ea88","Type":"ContainerStarted","Data":"5203eaacb20fc36f25af605dd7a4e8059cbc8346fd405f2c07e391d770583144"} Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.421185 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.426385 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94"] Jan 29 14:18:06 crc kubenswrapper[4753]: W0129 14:18:06.427620 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c551ac_f50b_4773_8c83_e3e10e76f0c1.slice/crio-c4f8928b6f1bf49e9a5a3a5f0291f340e0b65df0f84cbb6ca1a115a20d8c024a WatchSource:0}: Error finding container c4f8928b6f1bf49e9a5a3a5f0291f340e0b65df0f84cbb6ca1a115a20d8c024a: Status 404 returned error can't find the container with id c4f8928b6f1bf49e9a5a3a5f0291f340e0b65df0f84cbb6ca1a115a20d8c024a Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.432898 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb"] Jan 29 14:18:06 crc kubenswrapper[4753]: W0129 14:18:06.434749 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3142007_86c8_4dda_a225_813a250be829.slice/crio-bc89759536b6567a7fb8e70477d13796319de2993a305029563a749607e9d486 WatchSource:0}: Error finding container bc89759536b6567a7fb8e70477d13796319de2993a305029563a749607e9d486: Status 404 returned error can't find the container with id bc89759536b6567a7fb8e70477d13796319de2993a305029563a749607e9d486 Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.437694 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbt5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-pkk94_openstack-operators(a3142007-86c8-4dda-a225-813a250be829): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.438953 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" podUID="a3142007-86c8-4dda-a225-813a250be829" Jan 29 14:18:06 crc kubenswrapper[4753]: W0129 14:18:06.441190 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc8c4def_3a2e_4f72_a398_fb1c7d6e47e4.slice/crio-16963c32d6ca3550bdd197c0abcd31d8041d23859589db27555f362c446d573d WatchSource:0}: Error finding container 16963c32d6ca3550bdd197c0abcd31d8041d23859589db27555f362c446d573d: Status 404 returned error can't find the container with id 16963c32d6ca3550bdd197c0abcd31d8041d23859589db27555f362c446d573d Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.442532 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hbxfc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-56nqb_openstack-operators(fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.444633 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" podUID="fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4" Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.556097 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.556320 4753 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.556408 4753 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert podName:20e19d51-387b-4da6-8e39-652b24176ef9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:08.556390464 +0000 UTC m=+923.251124836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" (UID: "20e19d51-387b-4da6-8e39-652b24176ef9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.640205 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.655269 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-wfszx"] Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.675937 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5kz9g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-57nnj_openstack-operators(ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 
14:18:06.677990 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" podUID="ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0" Jan 29 14:18:06 crc kubenswrapper[4753]: W0129 14:18:06.688769 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26b13b81_bb4c_4b22_88a3_975875eb76dc.slice/crio-02117f46a86caa7d822ea8786b00e7711621c47352fceb97e36dfef0ccdaf1c8 WatchSource:0}: Error finding container 02117f46a86caa7d822ea8786b00e7711621c47352fceb97e36dfef0ccdaf1c8: Status 404 returned error can't find the container with id 02117f46a86caa7d822ea8786b00e7711621c47352fceb97e36dfef0ccdaf1c8 Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.688889 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sqtrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-lml7t_openstack-operators(4f2207b2-9101-4661-8ccf-d2eb0c57092a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.690429 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: 
\"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" podUID="4f2207b2-9101-4661-8ccf-d2eb0c57092a" Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.691432 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lc89j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-64b5b76f97-5nnl6_openstack-operators(460c2c4f-24cd-4817-8145-20641c54a23e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.692620 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" podUID="460c2c4f-24cd-4817-8145-20641c54a23e" Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.694860 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.703780 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.709564 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj"] Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.715104 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvnht"] Jan 29 14:18:06 crc kubenswrapper[4753]: W0129 14:18:06.769176 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod151add79_d51f_439d_a014_d0a42aa992f1.slice/crio-e9b38ac8834a7b9869be89e40d450f2e32804b79c55c6ea2ecf9db72aff160d2 WatchSource:0}: Error finding container e9b38ac8834a7b9869be89e40d450f2e32804b79c55c6ea2ecf9db72aff160d2: Status 404 returned error can't find the container with id e9b38ac8834a7b9869be89e40d450f2e32804b79c55c6ea2ecf9db72aff160d2 Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.861842 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:06 crc kubenswrapper[4753]: I0129 14:18:06.861884 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.862047 4753 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.862078 4753 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.862128 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:08.862106353 +0000 UTC m=+923.556840735 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "metrics-server-cert" not found Jan 29 14:18:06 crc kubenswrapper[4753]: E0129 14:18:06.862164 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:08.862141694 +0000 UTC m=+923.556876156 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "webhook-server-cert" not found Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.297990 4753 generic.go:334] "Generic (PLEG): container finished" podID="151add79-d51f-439d-a014-d0a42aa992f1" containerID="6d32fb976a86cd4fad767bd7972f3129ac09134742dbb3023a7513a9cc95eff1" exitCode=0 Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.298080 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvnht" event={"ID":"151add79-d51f-439d-a014-d0a42aa992f1","Type":"ContainerDied","Data":"6d32fb976a86cd4fad767bd7972f3129ac09134742dbb3023a7513a9cc95eff1"} Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.298115 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvnht" event={"ID":"151add79-d51f-439d-a014-d0a42aa992f1","Type":"ContainerStarted","Data":"e9b38ac8834a7b9869be89e40d450f2e32804b79c55c6ea2ecf9db72aff160d2"} Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.301069 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" event={"ID":"26b13b81-bb4c-4b22-88a3-975875eb76dc","Type":"ContainerStarted","Data":"02117f46a86caa7d822ea8786b00e7711621c47352fceb97e36dfef0ccdaf1c8"} Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.310954 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" event={"ID":"a3142007-86c8-4dda-a225-813a250be829","Type":"ContainerStarted","Data":"bc89759536b6567a7fb8e70477d13796319de2993a305029563a749607e9d486"} Jan 29 14:18:07 crc kubenswrapper[4753]: E0129 14:18:07.312479 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" podUID="a3142007-86c8-4dda-a225-813a250be829" Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.331814 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" event={"ID":"4f2207b2-9101-4661-8ccf-d2eb0c57092a","Type":"ContainerStarted","Data":"8cc7d3f6e1f616ac81497bef00bc53759fa7ddab95c6722795fbf24b13207701"} Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.334542 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" event={"ID":"460c2c4f-24cd-4817-8145-20641c54a23e","Type":"ContainerStarted","Data":"7a61e478f874740f4ad9df31983b12468287a7f1ace2f5e08f0dc8cd3bd5f86b"} Jan 29 14:18:07 crc kubenswrapper[4753]: E0129 14:18:07.334679 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" podUID="4f2207b2-9101-4661-8ccf-d2eb0c57092a" Jan 29 14:18:07 
crc kubenswrapper[4753]: E0129 14:18:07.337020 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" podUID="460c2c4f-24cd-4817-8145-20641c54a23e" Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.342558 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c" event={"ID":"10c551ac-f50b-4773-8c83-e3e10e76f0c1","Type":"ContainerStarted","Data":"c4f8928b6f1bf49e9a5a3a5f0291f340e0b65df0f84cbb6ca1a115a20d8c024a"} Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.346189 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" event={"ID":"fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4","Type":"ContainerStarted","Data":"16963c32d6ca3550bdd197c0abcd31d8041d23859589db27555f362c446d573d"} Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.349610 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" event={"ID":"ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0","Type":"ContainerStarted","Data":"86b0f3aed91223e2527eca91c4445b23f7570d6c817877f564c9c75b56365fb6"} Jan 29 14:18:07 crc kubenswrapper[4753]: E0129 14:18:07.351096 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" podUID="fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4" Jan 29 14:18:07 crc kubenswrapper[4753]: E0129 14:18:07.353924 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" podUID="ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0" Jan 29 14:18:07 crc kubenswrapper[4753]: I0129 14:18:07.358403 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" event={"ID":"53a3af2e-de03-483f-ba36-253eb5e9db1d","Type":"ContainerStarted","Data":"fe61d0f932c0835259d5eee497b74e3ac7bbdc3d5f31dbe221a263579ed25a43"} Jan 29 14:18:08 crc kubenswrapper[4753]: I0129 14:18:08.085507 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.085842 4753 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.086096 4753 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert podName:6bfa698b-c528-4171-88ee-3480e2715dc9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:12.086076691 +0000 UTC m=+926.780811073 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert") pod "infra-operator-controller-manager-79955696d6-mgwrk" (UID: "6bfa698b-c528-4171-88ee-3480e2715dc9") : secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.404829 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" podUID="ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0" Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.405801 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" podUID="a3142007-86c8-4dda-a225-813a250be829" Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.405792 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" podUID="460c2c4f-24cd-4817-8145-20641c54a23e" Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.406082 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" podUID="4f2207b2-9101-4661-8ccf-d2eb0c57092a" Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.406235 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" podUID="fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4" Jan 29 14:18:08 crc kubenswrapper[4753]: I0129 14:18:08.632033 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.633438 4753 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.633530 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert podName:20e19d51-387b-4da6-8e39-652b24176ef9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:12.63351051 +0000 UTC m=+927.328244892 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" (UID: "20e19d51-387b-4da6-8e39-652b24176ef9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:08 crc kubenswrapper[4753]: I0129 14:18:08.941639 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:08 crc kubenswrapper[4753]: I0129 14:18:08.941838 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.941784 4753 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.941925 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:12.941911213 +0000 UTC m=+927.636645595 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "metrics-server-cert" not found Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.942099 4753 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 14:18:08 crc kubenswrapper[4753]: E0129 14:18:08.942193 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:12.94217249 +0000 UTC m=+927.636906872 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "webhook-server-cert" not found Jan 29 14:18:09 crc kubenswrapper[4753]: I0129 14:18:09.414034 4753 generic.go:334] "Generic (PLEG): container finished" podID="151add79-d51f-439d-a014-d0a42aa992f1" containerID="15a8d427875fab0492c5e2b931ed5a6070e7fe3d8404e6e5779d476c8556f7c2" exitCode=0 Jan 29 14:18:09 crc kubenswrapper[4753]: I0129 14:18:09.414116 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvnht" event={"ID":"151add79-d51f-439d-a014-d0a42aa992f1","Type":"ContainerDied","Data":"15a8d427875fab0492c5e2b931ed5a6070e7fe3d8404e6e5779d476c8556f7c2"} Jan 29 14:18:12 crc kubenswrapper[4753]: I0129 14:18:12.103082 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:12 crc kubenswrapper[4753]: E0129 14:18:12.103346 4753 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:12 crc kubenswrapper[4753]: E0129 14:18:12.104227 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert podName:6bfa698b-c528-4171-88ee-3480e2715dc9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:20.104197256 +0000 UTC m=+934.798931638 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert") pod "infra-operator-controller-manager-79955696d6-mgwrk" (UID: "6bfa698b-c528-4171-88ee-3480e2715dc9") : secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:12 crc kubenswrapper[4753]: I0129 14:18:12.717724 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:12 crc kubenswrapper[4753]: E0129 14:18:12.717938 4753 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:12 crc kubenswrapper[4753]: E0129 14:18:12.718023 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert podName:20e19d51-387b-4da6-8e39-652b24176ef9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:20.71800398 +0000 UTC m=+935.412738362 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" (UID: "20e19d51-387b-4da6-8e39-652b24176ef9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:13 crc kubenswrapper[4753]: I0129 14:18:13.022227 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:13 crc kubenswrapper[4753]: I0129 14:18:13.022299 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:13 crc kubenswrapper[4753]: E0129 14:18:13.022577 4753 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 14:18:13 crc kubenswrapper[4753]: E0129 14:18:13.022645 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:21.022625309 +0000 UTC m=+935.717359691 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "webhook-server-cert" not found Jan 29 14:18:13 crc kubenswrapper[4753]: E0129 14:18:13.023083 4753 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 14:18:13 crc kubenswrapper[4753]: E0129 14:18:13.023117 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:21.023110123 +0000 UTC m=+935.717844505 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "metrics-server-cert" not found Jan 29 14:18:19 crc kubenswrapper[4753]: E0129 14:18:19.121580 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be" Jan 29 14:18:19 crc kubenswrapper[4753]: E0129 14:18:19.122237 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cnqfr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-8wdkj_openstack-operators(9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:18:19 crc kubenswrapper[4753]: E0129 14:18:19.123399 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" 
podUID="9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb" Jan 29 14:18:19 crc kubenswrapper[4753]: E0129 14:18:19.504453 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" podUID="9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb" Jan 29 14:18:20 crc kubenswrapper[4753]: I0129 14:18:20.155245 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:20 crc kubenswrapper[4753]: E0129 14:18:20.155561 4753 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:20 crc kubenswrapper[4753]: E0129 14:18:20.155646 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert podName:6bfa698b-c528-4171-88ee-3480e2715dc9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:36.155624738 +0000 UTC m=+950.850359130 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert") pod "infra-operator-controller-manager-79955696d6-mgwrk" (UID: "6bfa698b-c528-4171-88ee-3480e2715dc9") : secret "infra-operator-webhook-server-cert" not found Jan 29 14:18:20 crc kubenswrapper[4753]: I0129 14:18:20.766245 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:20 crc kubenswrapper[4753]: E0129 14:18:20.766527 4753 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:20 crc kubenswrapper[4753]: E0129 14:18:20.766637 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert podName:20e19d51-387b-4da6-8e39-652b24176ef9 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:36.766608964 +0000 UTC m=+951.461343386 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" (UID: "20e19d51-387b-4da6-8e39-652b24176ef9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 29 14:18:21 crc kubenswrapper[4753]: I0129 14:18:21.071266 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:21 crc kubenswrapper[4753]: I0129 14:18:21.071337 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:21 crc kubenswrapper[4753]: E0129 14:18:21.071548 4753 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 29 14:18:21 crc kubenswrapper[4753]: E0129 14:18:21.071610 4753 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 29 14:18:21 crc kubenswrapper[4753]: E0129 14:18:21.071649 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:37.071624064 +0000 UTC m=+951.766358486 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "metrics-server-cert" not found Jan 29 14:18:21 crc kubenswrapper[4753]: E0129 14:18:21.071683 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs podName:d7e2152d-0998-475e-b645-23df5698e858 nodeName:}" failed. No retries permitted until 2026-01-29 14:18:37.071661985 +0000 UTC m=+951.766396407 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-dtkh6" (UID: "d7e2152d-0998-475e-b645-23df5698e858") : secret "webhook-server-cert" not found Jan 29 14:18:26 crc kubenswrapper[4753]: E0129 14:18:26.260199 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf" Jan 29 14:18:26 crc kubenswrapper[4753]: E0129 14:18:26.262052 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v5d6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67bf948998-9777h_openstack-operators(7d8a4a16-258a-4759-980b-98f13fa2e64c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:18:26 crc kubenswrapper[4753]: E0129 14:18:26.263610 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" 
podUID="7d8a4a16-258a-4759-980b-98f13fa2e64c" Jan 29 14:18:26 crc kubenswrapper[4753]: E0129 14:18:26.560892 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" podUID="7d8a4a16-258a-4759-980b-98f13fa2e64c" Jan 29 14:18:26 crc kubenswrapper[4753]: E0129 14:18:26.773961 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b" Jan 29 14:18:26 crc kubenswrapper[4753]: E0129 14:18:26.774245 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z69r9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-wfszx_openstack-operators(53a3af2e-de03-483f-ba36-253eb5e9db1d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:18:26 crc kubenswrapper[4753]: E0129 14:18:26.775642 4753 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" podUID="53a3af2e-de03-483f-ba36-253eb5e9db1d" Jan 29 14:18:27 crc kubenswrapper[4753]: I0129 14:18:27.054595 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:18:27 crc kubenswrapper[4753]: I0129 14:18:27.054660 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:18:27 crc kubenswrapper[4753]: E0129 14:18:27.602386 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" podUID="53a3af2e-de03-483f-ba36-253eb5e9db1d" Jan 29 14:18:28 crc kubenswrapper[4753]: E0129 14:18:28.518204 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 29 14:18:28 crc kubenswrapper[4753]: E0129 14:18:28.518779 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
Jan 29 14:18:28 crc kubenswrapper[4753]: E0129 14:18:28.520435 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" podUID="26b13b81-bb4c-4b22-88a3-975875eb76dc"
Jan 29 14:18:28 crc kubenswrapper[4753]: E0129 14:18:28.608844 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" podUID="26b13b81-bb4c-4b22-88a3-975875eb76dc"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.641980 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" event={"ID":"460c2c4f-24cd-4817-8145-20641c54a23e","Type":"ContainerStarted","Data":"fb460d149d8cfc649c6bf9a85efbabcd91e93061329bd4661d6cd505eff64cbe"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.642692 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.644317 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd" event={"ID":"3482315b-e8cc-4dcf-9eb6-fb120739e361","Type":"ContainerStarted","Data":"a57bf7c2b5f3659f2ae2994242b5740f50216e968b01d2da232bdaedb32af50c"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.644454 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.645760 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" event={"ID":"ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0","Type":"ContainerStarted","Data":"82ba448a42128bb546b0092b4359d409ccf88da6385038bc07247714a7e122bd"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.645920 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.647327 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq" event={"ID":"ea463c74-766e-424b-a930-cc8cad45ea88","Type":"ContainerStarted","Data":"5a8d982e9ab034372a8de94374dc16c65de17e13657b98e4c577a7eedff6b8b2"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.647665 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.648724 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" event={"ID":"a3142007-86c8-4dda-a225-813a250be829","Type":"ContainerStarted","Data":"858c733c2d3ce1945fe8704ed3b4dfafdc2e5755a426384f2e8dfd31f5cfd18d"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.649041 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.650512 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j" event={"ID":"8ef2b554-7857-404c-adce-f82ebcf71f72","Type":"ContainerStarted","Data":"959fc1e67e2af9659313c371bd76543449ad1a9b4793f0132a653a4a3927f23d"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.650832 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.651833 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" event={"ID":"fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4","Type":"ContainerStarted","Data":"c8df1a0ec6be11cc2a57e21bdc7ebbf40717260a10a9c4364607034f99599f14"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.652137 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.653099 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x" event={"ID":"88bc469c-847c-4e52-9612-cfd238cbcf3d","Type":"ContainerStarted","Data":"fac8c4e2fc2e78bc8daee3c99457939a2aadcb213f5b59e6effe2705d22d2a96"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.653423 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.654456 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq" event={"ID":"81d6f0fa-ae4f-46e3-8103-bc97b1afc209","Type":"ContainerStarted","Data":"a3bfa3fcabb8370d7eb4023950953d45eaf03cb48686ff969408232c1c4a1091"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.654784 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.655792 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" event={"ID":"4f2207b2-9101-4661-8ccf-d2eb0c57092a","Type":"ContainerStarted","Data":"d9ae266b70bb2dc15fd1dd52ba1dae1bec4844a66a8ce494ec281f8a70e6d5e8"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.656096 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.657970 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvnht" event={"ID":"151add79-d51f-439d-a014-d0a42aa992f1","Type":"ContainerStarted","Data":"fdbbb7aaac9353a8359ee36cbfdf4b0064d4d7c9d9f388cbefd7a2163f7dc7cc"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.659011 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7" event={"ID":"f20cfb79-500e-4652-b32d-098d8b27e031","Type":"ContainerStarted","Data":"169f122a5e6aa66b5da7d98797b144f9fb963ec0390a6f963bf0f3e72b5ecc81"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.659363 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.660360 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm" event={"ID":"6d74ea48-8122-4dae-9adf-472a4d2ce3c9","Type":"ContainerStarted","Data":"4b7f896b206e7f2bbb9cc29da49c57ece33d6bb8f11288e46c8ecb1046798415"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.660682 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.661780 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d" event={"ID":"ed5e20f7-cf91-4238-9472-eba0bcc3183b","Type":"ContainerStarted","Data":"8dedcbf8d99c8f8534534f12fd5e33975d870b496e5e32ad843c3d91edf73272"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.662119 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.663093 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c" event={"ID":"10c551ac-f50b-4773-8c83-e3e10e76f0c1","Type":"ContainerStarted","Data":"2088acb2e1eaf71a1ec40ada63da3b86cea904bbc812bcc8c2b7244228072593"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.663435 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c"
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.665905 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" event={"ID":"91f93520-085a-4d27-bffa-f8bc1956c686","Type":"ContainerStarted","Data":"8494f08dcf9213489b4dba43c6695d1fbf1e7543481462c0b1635681a6d27f6d"}
Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.666077 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr"
pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.667233 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" event={"ID":"9d8da066-fe2d-4cf7-b721-3155f8f11510","Type":"ContainerStarted","Data":"449c3f3ebfe77b667c546e2c713855be42380de61453e299bdd1d1f456df8e0c"} Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.667430 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.669738 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" podStartSLOduration=3.937714972 podStartE2EDuration="28.669722691s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.691305409 +0000 UTC m=+921.386039791" lastFinishedPulling="2026-01-29 14:18:31.423313128 +0000 UTC m=+946.118047510" observedRunningTime="2026-01-29 14:18:32.665657211 +0000 UTC m=+947.360391593" watchObservedRunningTime="2026-01-29 14:18:32.669722691 +0000 UTC m=+947.364457073" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.700839 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x" podStartSLOduration=5.528717456 podStartE2EDuration="28.700824922s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:05.898370607 +0000 UTC m=+920.593104989" lastFinishedPulling="2026-01-29 14:18:29.070478063 +0000 UTC m=+943.765212455" observedRunningTime="2026-01-29 14:18:32.697086751 +0000 UTC m=+947.391821133" watchObservedRunningTime="2026-01-29 14:18:32.700824922 +0000 UTC m=+947.395559304" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.772955 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d" podStartSLOduration=5.486775903 podStartE2EDuration="28.77294241s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.241642441 +0000 UTC m=+920.936376823" lastFinishedPulling="2026-01-29 14:18:29.527808948 +0000 UTC m=+944.222543330" observedRunningTime="2026-01-29 14:18:32.76998313 +0000 UTC m=+947.464717512" watchObservedRunningTime="2026-01-29 14:18:32.77294241 +0000 UTC m=+947.467676792" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.774750 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" podStartSLOduration=4.042706309 podStartE2EDuration="28.774745328s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.675664067 +0000 UTC m=+921.370398449" lastFinishedPulling="2026-01-29 14:18:31.407703046 +0000 UTC m=+946.102437468" observedRunningTime="2026-01-29 14:18:32.740682878 +0000 UTC m=+947.435417260" watchObservedRunningTime="2026-01-29 14:18:32.774745328 +0000 UTC m=+947.469479710" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.823160 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j" podStartSLOduration=4.745367794 podStartE2EDuration="28.823132386s" 
podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.102184264 +0000 UTC m=+920.796918646" lastFinishedPulling="2026-01-29 14:18:30.179948846 +0000 UTC m=+944.874683238" observedRunningTime="2026-01-29 14:18:32.81513951 +0000 UTC m=+947.509873892" watchObservedRunningTime="2026-01-29 14:18:32.823132386 +0000 UTC m=+947.517866768" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.871039 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c" podStartSLOduration=5.77795772 podStartE2EDuration="28.871015829s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.434695247 +0000 UTC m=+921.129429629" lastFinishedPulling="2026-01-29 14:18:29.527753356 +0000 UTC m=+944.222487738" observedRunningTime="2026-01-29 14:18:32.859053767 +0000 UTC m=+947.553788149" watchObservedRunningTime="2026-01-29 14:18:32.871015829 +0000 UTC m=+947.565750211" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.894435 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq" podStartSLOduration=5.518871531 podStartE2EDuration="28.894409282s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:05.695062015 +0000 UTC m=+920.389796397" lastFinishedPulling="2026-01-29 14:18:29.070599766 +0000 UTC m=+943.765334148" observedRunningTime="2026-01-29 14:18:32.888618135 +0000 UTC m=+947.583352517" watchObservedRunningTime="2026-01-29 14:18:32.894409282 +0000 UTC m=+947.589143654" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.923770 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" podStartSLOduration=3.953775376 podStartE2EDuration="28.923741924s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.437461691 +0000 UTC m=+921.132196073" lastFinishedPulling="2026-01-29 14:18:31.407428199 +0000 UTC m=+946.102162621" observedRunningTime="2026-01-29 14:18:32.90988448 +0000 UTC m=+947.604618862" watchObservedRunningTime="2026-01-29 14:18:32.923741924 +0000 UTC m=+947.618476306" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.932967 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd" podStartSLOduration=5.515498299 podStartE2EDuration="28.932943002s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.110422686 +0000 UTC m=+920.805157068" lastFinishedPulling="2026-01-29 14:18:29.527867389 +0000 UTC m=+944.222601771" observedRunningTime="2026-01-29 14:18:32.931356339 +0000 UTC m=+947.626090721" watchObservedRunningTime="2026-01-29 14:18:32.932943002 +0000 UTC m=+947.627677384" Jan 29 14:18:32 crc kubenswrapper[4753]: I0129 14:18:32.982276 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7" podStartSLOduration=4.506569972 podStartE2EDuration="28.982258085s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:05.704337145 +0000 UTC m=+920.399071527" lastFinishedPulling="2026-01-29 14:18:30.180025258 +0000 UTC m=+944.874759640" observedRunningTime="2026-01-29 14:18:32.979359477 +0000 UTC 
m=+947.674093859" watchObservedRunningTime="2026-01-29 14:18:32.982258085 +0000 UTC m=+947.676992467" Jan 29 14:18:33 crc kubenswrapper[4753]: I0129 14:18:33.014332 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" podStartSLOduration=4.216706299 podStartE2EDuration="29.01430572s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.688735449 +0000 UTC m=+921.383469831" lastFinishedPulling="2026-01-29 14:18:31.48633486 +0000 UTC m=+946.181069252" observedRunningTime="2026-01-29 14:18:33.006341876 +0000 UTC m=+947.701076258" watchObservedRunningTime="2026-01-29 14:18:33.01430572 +0000 UTC m=+947.709040102" Jan 29 14:18:33 crc kubenswrapper[4753]: I0129 14:18:33.045758 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xvnht" podStartSLOduration=6.854916932 podStartE2EDuration="28.04572675s" podCreationTimestamp="2026-01-29 14:18:05 +0000 UTC" firstStartedPulling="2026-01-29 14:18:07.310132838 +0000 UTC m=+922.004867210" lastFinishedPulling="2026-01-29 14:18:28.500942646 +0000 UTC m=+943.195677028" observedRunningTime="2026-01-29 14:18:33.042540844 +0000 UTC m=+947.737275216" watchObservedRunningTime="2026-01-29 14:18:33.04572675 +0000 UTC m=+947.740461142" Jan 29 14:18:33 crc kubenswrapper[4753]: I0129 14:18:33.065364 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm" podStartSLOduration=6.2368882469999996 podStartE2EDuration="29.065340249s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.242064152 +0000 UTC m=+920.936798534" lastFinishedPulling="2026-01-29 14:18:29.070516154 +0000 UTC m=+943.765250536" observedRunningTime="2026-01-29 14:18:33.061582188 +0000 UTC m=+947.756316560" watchObservedRunningTime="2026-01-29 14:18:33.065340249 +0000 UTC m=+947.760074631" Jan 29 14:18:33 crc kubenswrapper[4753]: I0129 14:18:33.083901 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq" podStartSLOduration=6.088996443 podStartE2EDuration="29.08387864s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.075570695 +0000 UTC m=+920.770305077" lastFinishedPulling="2026-01-29 14:18:29.070452892 +0000 UTC m=+943.765187274" observedRunningTime="2026-01-29 14:18:33.080853298 +0000 UTC m=+947.775587680" watchObservedRunningTime="2026-01-29 14:18:33.08387864 +0000 UTC m=+947.778613022" Jan 29 14:18:33 crc kubenswrapper[4753]: I0129 14:18:33.110985 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" podStartSLOduration=4.14711964 podStartE2EDuration="29.110958652s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.442357603 +0000 UTC m=+921.137091985" lastFinishedPulling="2026-01-29 14:18:31.406196575 +0000 UTC m=+946.100930997" observedRunningTime="2026-01-29 14:18:33.105620678 +0000 UTC m=+947.800355060" watchObservedRunningTime="2026-01-29 14:18:33.110958652 +0000 UTC m=+947.805693034" Jan 29 14:18:33 crc kubenswrapper[4753]: I0129 14:18:33.130031 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" podStartSLOduration=6.170313079 podStartE2EDuration="29.130013786s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.110763805 +0000 UTC m=+920.805498187" lastFinishedPulling="2026-01-29 14:18:29.070464512 +0000 UTC m=+943.765198894" observedRunningTime="2026-01-29 14:18:33.127331924 +0000 UTC m=+947.822066306" watchObservedRunningTime="2026-01-29 14:18:33.130013786 +0000 UTC m=+947.824748168" Jan 29 14:18:33 crc kubenswrapper[4753]: I0129 14:18:33.148683 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" podStartSLOduration=5.772850461 podStartE2EDuration="29.14866728s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:05.694769447 +0000 UTC m=+920.389503829" lastFinishedPulling="2026-01-29 14:18:29.070586226 +0000 UTC m=+943.765320648" observedRunningTime="2026-01-29 14:18:33.146088611 +0000 UTC m=+947.840823013" watchObservedRunningTime="2026-01-29 14:18:33.14866728 +0000 UTC m=+947.843401662" Jan 29 14:18:35 crc kubenswrapper[4753]: I0129 14:18:35.705963 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:35 crc kubenswrapper[4753]: I0129 14:18:35.706798 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:35 crc kubenswrapper[4753]: I0129 14:18:35.765969 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.254706 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.270948 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.484556 4753 util.go:30] "No sandbox for pod can be found. 
Jan 29 14:18:35 crc kubenswrapper[4753]: I0129 14:18:35.705963 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xvnht"
Jan 29 14:18:35 crc kubenswrapper[4753]: I0129 14:18:35.706798 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xvnht"
Jan 29 14:18:35 crc kubenswrapper[4753]: I0129 14:18:35.765969 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xvnht"
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.254706 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk"
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.270948 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bfa698b-c528-4171-88ee-3480e2715dc9-cert\") pod \"infra-operator-controller-manager-79955696d6-mgwrk\" (UID: \"6bfa698b-c528-4171-88ee-3480e2715dc9\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk"
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.484556 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk"
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.698481 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" event={"ID":"9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb","Type":"ContainerStarted","Data":"8be43a46db40b3296d2416b4a6d2fae0ef1b70c7d6c819a70ed5dd2c559301fd"}
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.699167 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj"
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.727555 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" podStartSLOduration=3.30912258 podStartE2EDuration="32.727538399s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.242079352 +0000 UTC m=+920.936813734" lastFinishedPulling="2026-01-29 14:18:35.660495141 +0000 UTC m=+950.355229553" observedRunningTime="2026-01-29 14:18:36.718366441 +0000 UTC m=+951.413100823" watchObservedRunningTime="2026-01-29 14:18:36.727538399 +0000 UTC m=+951.422272781"
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.757144 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk"]
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.778869 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xvnht"
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.824549 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvnht"]
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.864845 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq"
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.873045 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20e19d51-387b-4da6-8e39-652b24176ef9-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq\" (UID: \"20e19d51-387b-4da6-8e39-652b24176ef9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq"
Jan 29 14:18:36 crc kubenswrapper[4753]: I0129 14:18:36.970467 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq"
Jan 29 14:18:37 crc kubenswrapper[4753]: I0129 14:18:37.169747 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6"
Jan 29 14:18:37 crc kubenswrapper[4753]: I0129 14:18:37.170494 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6"
Jan 29 14:18:37 crc kubenswrapper[4753]: I0129 14:18:37.178627 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6"
Jan 29 14:18:37 crc kubenswrapper[4753]: I0129 14:18:37.189139 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7e2152d-0998-475e-b645-23df5698e858-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-dtkh6\" (UID: \"d7e2152d-0998-475e-b645-23df5698e858\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6"
Jan 29 14:18:37 crc kubenswrapper[4753]: I0129 14:18:37.236783 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq"]
Jan 29 14:18:37 crc kubenswrapper[4753]: I0129 14:18:37.353915 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6"
Jan 29 14:18:37 crc kubenswrapper[4753]: I0129 14:18:37.722400 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" event={"ID":"20e19d51-387b-4da6-8e39-652b24176ef9","Type":"ContainerStarted","Data":"3afcee5f3430ad74b7b2a06fdc7a7f2c569e0152144267a8f786d10912577b4c"}
Jan 29 14:18:37 crc kubenswrapper[4753]: I0129 14:18:37.728777 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" event={"ID":"6bfa698b-c528-4171-88ee-3480e2715dc9","Type":"ContainerStarted","Data":"59b90ce7cb2d9a2c53af1f63ca20615cf366a9661a07b401999d8b72bf8d75a1"}
Jan 29 14:18:37 crc kubenswrapper[4753]: I0129 14:18:37.737917 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6"]
Jan 29 14:18:37 crc kubenswrapper[4753]: W0129 14:18:37.745802 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e2152d_0998_475e_b645_23df5698e858.slice/crio-e4fe4dab5dcf1889078a004d9e1a48e90a7a1e01dff2901d382b15fe1b699071 WatchSource:0}: Error finding container e4fe4dab5dcf1889078a004d9e1a48e90a7a1e01dff2901d382b15fe1b699071: Status 404 returned error can't find the container with id e4fe4dab5dcf1889078a004d9e1a48e90a7a1e01dff2901d382b15fe1b699071
Jan 29 14:18:38 crc kubenswrapper[4753]: I0129 14:18:38.735859 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xvnht" podUID="151add79-d51f-439d-a014-d0a42aa992f1" containerName="registry-server" containerID="cri-o://fdbbb7aaac9353a8359ee36cbfdf4b0064d4d7c9d9f388cbefd7a2163f7dc7cc" gracePeriod=2
Jan 29 14:18:38 crc kubenswrapper[4753]: I0129 14:18:38.737280 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" event={"ID":"d7e2152d-0998-475e-b645-23df5698e858","Type":"ContainerStarted","Data":"024922ab0c6efd50918f01cf59326f4895c6e1fdf6d3db7a307957c98bd47de2"}
Jan 29 14:18:38 crc kubenswrapper[4753]: I0129 14:18:38.737308 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6"
Jan 29 14:18:38 crc kubenswrapper[4753]: I0129 14:18:38.737317 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" event={"ID":"d7e2152d-0998-475e-b645-23df5698e858","Type":"ContainerStarted","Data":"e4fe4dab5dcf1889078a004d9e1a48e90a7a1e01dff2901d382b15fe1b699071"}
Jan 29 14:18:38 crc kubenswrapper[4753]: I0129 14:18:38.781398 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" podStartSLOduration=34.781380486 podStartE2EDuration="34.781380486s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:18:38.7630241 +0000 UTC m=+953.457758482" watchObservedRunningTime="2026-01-29 14:18:38.781380486 +0000 UTC m=+953.476114868"
Jan 29 14:18:39 crc kubenswrapper[4753]: I0129 14:18:39.751681 4753 generic.go:334] "Generic (PLEG): container finished" podID="151add79-d51f-439d-a014-d0a42aa992f1" containerID="fdbbb7aaac9353a8359ee36cbfdf4b0064d4d7c9d9f388cbefd7a2163f7dc7cc" exitCode=0
finished" podID="151add79-d51f-439d-a014-d0a42aa992f1" containerID="fdbbb7aaac9353a8359ee36cbfdf4b0064d4d7c9d9f388cbefd7a2163f7dc7cc" exitCode=0 Jan 29 14:18:39 crc kubenswrapper[4753]: I0129 14:18:39.751747 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvnht" event={"ID":"151add79-d51f-439d-a014-d0a42aa992f1","Type":"ContainerDied","Data":"fdbbb7aaac9353a8359ee36cbfdf4b0064d4d7c9d9f388cbefd7a2163f7dc7cc"} Jan 29 14:18:39 crc kubenswrapper[4753]: I0129 14:18:39.955108 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.030918 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-catalog-content\") pod \"151add79-d51f-439d-a014-d0a42aa992f1\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.030960 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-utilities\") pod \"151add79-d51f-439d-a014-d0a42aa992f1\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.031070 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2r6h\" (UniqueName: \"kubernetes.io/projected/151add79-d51f-439d-a014-d0a42aa992f1-kube-api-access-k2r6h\") pod \"151add79-d51f-439d-a014-d0a42aa992f1\" (UID: \"151add79-d51f-439d-a014-d0a42aa992f1\") " Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.032384 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-utilities" (OuterVolumeSpecName: "utilities") pod "151add79-d51f-439d-a014-d0a42aa992f1" (UID: "151add79-d51f-439d-a014-d0a42aa992f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.045506 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151add79-d51f-439d-a014-d0a42aa992f1-kube-api-access-k2r6h" (OuterVolumeSpecName: "kube-api-access-k2r6h") pod "151add79-d51f-439d-a014-d0a42aa992f1" (UID: "151add79-d51f-439d-a014-d0a42aa992f1"). InnerVolumeSpecName "kube-api-access-k2r6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.109563 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "151add79-d51f-439d-a014-d0a42aa992f1" (UID: "151add79-d51f-439d-a014-d0a42aa992f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.132866 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.132926 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/151add79-d51f-439d-a014-d0a42aa992f1-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.132937 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2r6h\" (UniqueName: \"kubernetes.io/projected/151add79-d51f-439d-a014-d0a42aa992f1-kube-api-access-k2r6h\") on node \"crc\" DevicePath \"\"" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.760218 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvnht" event={"ID":"151add79-d51f-439d-a014-d0a42aa992f1","Type":"ContainerDied","Data":"e9b38ac8834a7b9869be89e40d450f2e32804b79c55c6ea2ecf9db72aff160d2"} Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.760815 4753 scope.go:117] "RemoveContainer" containerID="fdbbb7aaac9353a8359ee36cbfdf4b0064d4d7c9d9f388cbefd7a2163f7dc7cc" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.760244 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvnht" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.761253 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" event={"ID":"7d8a4a16-258a-4759-980b-98f13fa2e64c","Type":"ContainerStarted","Data":"10edb4a98a199a97e9c35298f851374deea5e1b586130ba28ac33bdd0251334d"} Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.761505 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.763044 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" event={"ID":"20e19d51-387b-4da6-8e39-652b24176ef9","Type":"ContainerStarted","Data":"2604a63deca512204212a371587a09300032ecb6f6a9c4ee249030eb96a91588"} Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.763125 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.773214 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" event={"ID":"53a3af2e-de03-483f-ba36-253eb5e9db1d","Type":"ContainerStarted","Data":"7156f5e4a32d1d0097f7f625753c402987575f7e17fe8d2c51b286d08c780ea4"} Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.773570 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.775554 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" 
event={"ID":"6bfa698b-c528-4171-88ee-3480e2715dc9","Type":"ContainerStarted","Data":"19a4f8214079a350cd20d005dc6347bc3596de4b614076ddc917adadabd0481b"} Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.775741 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.779779 4753 scope.go:117] "RemoveContainer" containerID="15a8d427875fab0492c5e2b931ed5a6070e7fe3d8404e6e5779d476c8556f7c2" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.800709 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" podStartSLOduration=3.023402732 podStartE2EDuration="36.800690051s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.237302914 +0000 UTC m=+920.932037296" lastFinishedPulling="2026-01-29 14:18:40.014590193 +0000 UTC m=+954.709324615" observedRunningTime="2026-01-29 14:18:40.796174018 +0000 UTC m=+955.490908430" watchObservedRunningTime="2026-01-29 14:18:40.800690051 +0000 UTC m=+955.495424443" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.850819 4753 scope.go:117] "RemoveContainer" containerID="6d32fb976a86cd4fad767bd7972f3129ac09134742dbb3023a7513a9cc95eff1" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.871331 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" podStartSLOduration=34.112795584 podStartE2EDuration="36.871312899s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:37.244937577 +0000 UTC m=+951.939671959" lastFinishedPulling="2026-01-29 14:18:40.003454892 +0000 UTC m=+954.698189274" observedRunningTime="2026-01-29 14:18:40.866144529 +0000 UTC m=+955.560878911" watchObservedRunningTime="2026-01-29 14:18:40.871312899 +0000 UTC m=+955.566047281" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.905418 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvnht"] Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.917847 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xvnht"] Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.971827 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" podStartSLOduration=3.6198467340000002 podStartE2EDuration="36.971802354s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.670447435 +0000 UTC m=+921.365181827" lastFinishedPulling="2026-01-29 14:18:40.022403075 +0000 UTC m=+954.717137447" observedRunningTime="2026-01-29 14:18:40.967801906 +0000 UTC m=+955.662536288" watchObservedRunningTime="2026-01-29 14:18:40.971802354 +0000 UTC m=+955.666536746" Jan 29 14:18:40 crc kubenswrapper[4753]: I0129 14:18:40.976027 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" podStartSLOduration=33.826028987 podStartE2EDuration="36.976013298s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:36.76830013 +0000 UTC m=+951.463034512" lastFinishedPulling="2026-01-29 14:18:39.918284441 +0000 UTC m=+954.613018823" 
observedRunningTime="2026-01-29 14:18:40.935285107 +0000 UTC m=+955.630019499" watchObservedRunningTime="2026-01-29 14:18:40.976013298 +0000 UTC m=+955.670747680" Jan 29 14:18:41 crc kubenswrapper[4753]: I0129 14:18:41.791551 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" event={"ID":"26b13b81-bb4c-4b22-88a3-975875eb76dc","Type":"ContainerStarted","Data":"5f2ab3a53744b1764f1dc2f987baedcfa994cf572af1353b1dd1d45c2b18b654"} Jan 29 14:18:41 crc kubenswrapper[4753]: I0129 14:18:41.843284 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pkpp6" podStartSLOduration=3.998408863 podStartE2EDuration="37.843258777s" podCreationTimestamp="2026-01-29 14:18:04 +0000 UTC" firstStartedPulling="2026-01-29 14:18:06.725812962 +0000 UTC m=+921.420547344" lastFinishedPulling="2026-01-29 14:18:40.570662886 +0000 UTC m=+955.265397258" observedRunningTime="2026-01-29 14:18:41.834124061 +0000 UTC m=+956.528858503" watchObservedRunningTime="2026-01-29 14:18:41.843258777 +0000 UTC m=+956.537993159" Jan 29 14:18:42 crc kubenswrapper[4753]: I0129 14:18:42.156146 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151add79-d51f-439d-a014-d0a42aa992f1" path="/var/lib/kubelet/pods/151add79-d51f-439d-a014-d0a42aa992f1/volumes" Jan 29 14:18:44 crc kubenswrapper[4753]: I0129 14:18:44.503346 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-h5nxt" Jan 29 14:18:44 crc kubenswrapper[4753]: I0129 14:18:44.562315 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-n6kx7" Jan 29 14:18:44 crc kubenswrapper[4753]: I0129 14:18:44.617114 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-57kmq" Jan 29 14:18:44 crc kubenswrapper[4753]: I0129 14:18:44.635015 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-tjf4x" Jan 29 14:18:44 crc kubenswrapper[4753]: I0129 14:18:44.646205 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-2mvrq" Jan 29 14:18:44 crc kubenswrapper[4753]: I0129 14:18:44.687131 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-28p9j" Jan 29 14:18:44 crc kubenswrapper[4753]: I0129 14:18:44.779337 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-vc8jd" Jan 29 14:18:44 crc kubenswrapper[4753]: I0129 14:18:44.848449 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-jnsmr" Jan 29 14:18:44 crc kubenswrapper[4753]: I0129 14:18:44.986422 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-n6t9d" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.031510 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-9777h" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.117738 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hr6vm" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.129427 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8wdkj" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.221502 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-lml7t" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.257287 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-c6w2c" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.286936 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-pkk94" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.312451 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-56nqb" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.493334 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-5nnl6" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.516534 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-wfszx" Jan 29 14:18:45 crc kubenswrapper[4753]: I0129 14:18:45.555331 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-57nnj" Jan 29 14:18:46 crc kubenswrapper[4753]: I0129 14:18:46.495437 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-mgwrk" Jan 29 14:18:46 crc kubenswrapper[4753]: I0129 14:18:46.977678 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq" Jan 29 14:18:47 crc kubenswrapper[4753]: I0129 14:18:47.366458 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-dtkh6" Jan 29 14:18:57 crc kubenswrapper[4753]: I0129 14:18:57.055470 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:18:57 crc kubenswrapper[4753]: I0129 14:18:57.056090 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:18:57 crc kubenswrapper[4753]: I0129 14:18:57.056181 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:18:57 crc kubenswrapper[4753]: I0129 14:18:57.057044 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60212ebb28237ec94902995089383e664d1c6ec845691e27febd40b2f34c00cd"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:18:57 crc kubenswrapper[4753]: I0129 14:18:57.057133 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://60212ebb28237ec94902995089383e664d1c6ec845691e27febd40b2f34c00cd" gracePeriod=600 Jan 29 14:18:57 crc kubenswrapper[4753]: I0129 14:18:57.950914 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="60212ebb28237ec94902995089383e664d1c6ec845691e27febd40b2f34c00cd" exitCode=0 Jan 29 14:18:57 crc kubenswrapper[4753]: I0129 14:18:57.950983 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"60212ebb28237ec94902995089383e664d1c6ec845691e27febd40b2f34c00cd"} Jan 29 14:18:57 crc kubenswrapper[4753]: I0129 14:18:57.951652 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"12d7924f9ff7f63db0598221481b584d9481ba358c87450c2b5683ad81272c03"} Jan 29 14:18:57 crc kubenswrapper[4753]: I0129 14:18:57.951722 4753 scope.go:117] "RemoveContainer" containerID="30a83f4047e7a21e63740d93f083e75525a0e3fe674659ba74e59493ea388ecf" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.534641 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-89t5g"] Jan 29 14:19:03 crc kubenswrapper[4753]: E0129 14:19:03.536276 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151add79-d51f-439d-a014-d0a42aa992f1" containerName="extract-utilities" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.536300 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="151add79-d51f-439d-a014-d0a42aa992f1" containerName="extract-utilities" Jan 29 14:19:03 crc kubenswrapper[4753]: E0129 14:19:03.536315 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151add79-d51f-439d-a014-d0a42aa992f1" containerName="extract-content" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.536326 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="151add79-d51f-439d-a014-d0a42aa992f1" containerName="extract-content" Jan 29 14:19:03 crc kubenswrapper[4753]: E0129 14:19:03.536358 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151add79-d51f-439d-a014-d0a42aa992f1" containerName="registry-server" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.536367 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="151add79-d51f-439d-a014-d0a42aa992f1" containerName="registry-server" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.536628 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="151add79-d51f-439d-a014-d0a42aa992f1" 
containerName="registry-server" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.538090 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.544785 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.545133 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.545344 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.545475 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-n4ghc" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.551619 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-89t5g"] Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.615463 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-qj4v7"] Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.617267 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.620932 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.637286 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-qj4v7"] Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.715898 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-config\") pod \"dnsmasq-dns-5f854695bc-qj4v7\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.715940 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7q96\" (UniqueName: \"kubernetes.io/projected/e170336b-8ce2-419e-b9e7-59d73f08f294-kube-api-access-c7q96\") pod \"dnsmasq-dns-5f854695bc-qj4v7\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.715972 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-dns-svc\") pod \"dnsmasq-dns-5f854695bc-qj4v7\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.716068 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxvls\" (UniqueName: \"kubernetes.io/projected/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-kube-api-access-cxvls\") pod \"dnsmasq-dns-84bb9d8bd9-89t5g\" (UID: \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.716129 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-config\") pod \"dnsmasq-dns-84bb9d8bd9-89t5g\" (UID: \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.817323 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxvls\" (UniqueName: \"kubernetes.io/projected/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-kube-api-access-cxvls\") pod \"dnsmasq-dns-84bb9d8bd9-89t5g\" (UID: \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.817405 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-config\") pod \"dnsmasq-dns-84bb9d8bd9-89t5g\" (UID: \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.817436 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-config\") pod \"dnsmasq-dns-5f854695bc-qj4v7\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.817462 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7q96\" (UniqueName: \"kubernetes.io/projected/e170336b-8ce2-419e-b9e7-59d73f08f294-kube-api-access-c7q96\") pod \"dnsmasq-dns-5f854695bc-qj4v7\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.817487 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-dns-svc\") pod \"dnsmasq-dns-5f854695bc-qj4v7\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.818533 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-dns-svc\") pod \"dnsmasq-dns-5f854695bc-qj4v7\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.818701 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-config\") pod \"dnsmasq-dns-84bb9d8bd9-89t5g\" (UID: \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.819111 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-config\") pod \"dnsmasq-dns-5f854695bc-qj4v7\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.841574 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7q96\" (UniqueName: \"kubernetes.io/projected/e170336b-8ce2-419e-b9e7-59d73f08f294-kube-api-access-c7q96\") pod \"dnsmasq-dns-5f854695bc-qj4v7\" (UID: 
\"e170336b-8ce2-419e-b9e7-59d73f08f294\") " pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.846024 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxvls\" (UniqueName: \"kubernetes.io/projected/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-kube-api-access-cxvls\") pod \"dnsmasq-dns-84bb9d8bd9-89t5g\" (UID: \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.888393 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:03 crc kubenswrapper[4753]: I0129 14:19:03.935694 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:04 crc kubenswrapper[4753]: I0129 14:19:04.513911 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 14:19:04 crc kubenswrapper[4753]: I0129 14:19:04.514650 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-89t5g"] Jan 29 14:19:04 crc kubenswrapper[4753]: I0129 14:19:04.523979 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-qj4v7"] Jan 29 14:19:05 crc kubenswrapper[4753]: I0129 14:19:05.024881 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" event={"ID":"e170336b-8ce2-419e-b9e7-59d73f08f294","Type":"ContainerStarted","Data":"bd714745be7913b0066854c7c9582fbc5d2fa07d7086c15255c9fc7f6923c169"} Jan 29 14:19:05 crc kubenswrapper[4753]: I0129 14:19:05.026333 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" event={"ID":"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a","Type":"ContainerStarted","Data":"f23b92a8bd82d658a07a5a82552e96a880650bd96cf09158bb7b538c32b9fc84"} Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.202468 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-qj4v7"] Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.228425 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-hnxhk"] Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.230419 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.258878 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-hnxhk"] Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.309543 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz87v\" (UniqueName: \"kubernetes.io/projected/8ff2aba6-2861-4506-be8d-5da3acf76f6d-kube-api-access-jz87v\") pod \"dnsmasq-dns-744ffd65bc-hnxhk\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.309623 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-config\") pod \"dnsmasq-dns-744ffd65bc-hnxhk\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.309648 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-hnxhk\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.410603 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz87v\" (UniqueName: \"kubernetes.io/projected/8ff2aba6-2861-4506-be8d-5da3acf76f6d-kube-api-access-jz87v\") pod \"dnsmasq-dns-744ffd65bc-hnxhk\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.410693 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-config\") pod \"dnsmasq-dns-744ffd65bc-hnxhk\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.410719 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-hnxhk\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.411568 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-hnxhk\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.412316 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-config\") pod \"dnsmasq-dns-744ffd65bc-hnxhk\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.480320 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz87v\" (UniqueName: 
\"kubernetes.io/projected/8ff2aba6-2861-4506-be8d-5da3acf76f6d-kube-api-access-jz87v\") pod \"dnsmasq-dns-744ffd65bc-hnxhk\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.552897 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-89t5g"] Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.562590 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.580901 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-4t289"] Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.582177 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.591270 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-4t289"] Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.725866 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-dns-svc\") pod \"dnsmasq-dns-95f5f6995-4t289\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.726008 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtr9r\" (UniqueName: \"kubernetes.io/projected/a2e787ea-764f-4911-8ffb-d6c213fded08-kube-api-access-wtr9r\") pod \"dnsmasq-dns-95f5f6995-4t289\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.726032 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-config\") pod \"dnsmasq-dns-95f5f6995-4t289\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.827261 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-dns-svc\") pod \"dnsmasq-dns-95f5f6995-4t289\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.827615 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtr9r\" (UniqueName: \"kubernetes.io/projected/a2e787ea-764f-4911-8ffb-d6c213fded08-kube-api-access-wtr9r\") pod \"dnsmasq-dns-95f5f6995-4t289\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.827635 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-config\") pod \"dnsmasq-dns-95f5f6995-4t289\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.828412 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-dns-svc\") pod \"dnsmasq-dns-95f5f6995-4t289\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.830257 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-config\") pod \"dnsmasq-dns-95f5f6995-4t289\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.847427 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtr9r\" (UniqueName: \"kubernetes.io/projected/a2e787ea-764f-4911-8ffb-d6c213fded08-kube-api-access-wtr9r\") pod \"dnsmasq-dns-95f5f6995-4t289\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:06 crc kubenswrapper[4753]: I0129 14:19:06.963972 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.147502 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-hnxhk"] Jan 29 14:19:07 crc kubenswrapper[4753]: W0129 14:19:07.157636 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ff2aba6_2861_4506_be8d_5da3acf76f6d.slice/crio-0551a3ac8e0d09962df5d325708aedd57c5dc2dffb20decde3436d3bfcebe953 WatchSource:0}: Error finding container 0551a3ac8e0d09962df5d325708aedd57c5dc2dffb20decde3436d3bfcebe953: Status 404 returned error can't find the container with id 0551a3ac8e0d09962df5d325708aedd57c5dc2dffb20decde3436d3bfcebe953 Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.396322 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.402285 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.406526 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.407975 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.408215 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.410626 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qdlrv" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.410894 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.411259 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.412422 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.432516 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.453936 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-4t289"] Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543297 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad5c04aa-ed92-4c33-ad37-4420b362e237-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543348 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543382 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543578 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543649 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl52d\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-kube-api-access-cl52d\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc 
kubenswrapper[4753]: I0129 14:19:07.543689 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543719 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543780 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543816 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543830 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.543873 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad5c04aa-ed92-4c33-ad37-4420b362e237-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.650954 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad5c04aa-ed92-4c33-ad37-4420b362e237-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.650999 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.651020 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.651058 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.651084 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl52d\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-kube-api-access-cl52d\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.651104 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.651120 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.651143 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.651174 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.651188 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.651207 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad5c04aa-ed92-4c33-ad37-4420b362e237-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.652281 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.652433 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.652594 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.654592 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.654884 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.654937 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.659336 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad5c04aa-ed92-4c33-ad37-4420b362e237-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.659860 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.660491 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad5c04aa-ed92-4c33-ad37-4420b362e237-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.662373 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.667888 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl52d\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-kube-api-access-cl52d\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.675416 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " pod="openstack/rabbitmq-server-0" Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.729496 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.730945 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.735064 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wqxsc"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.735403 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.735950 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.740637 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.740940 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.741213 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.750522 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.750705 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.882711 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.882819 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.882873 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrhjh\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-kube-api-access-qrhjh\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.882932 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.882955 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f7e3e27-a036-4623-8d63-557a3c0d76e6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.882970 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.883023 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.883096 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f7e3e27-a036-4623-8d63-557a3c0d76e6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.883122 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.883141 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.883202 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.922244 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987427 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f7e3e27-a036-4623-8d63-557a3c0d76e6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987518 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987549 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987592 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987641 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987671 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987693 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrhjh\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-kube-api-access-qrhjh\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987735 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987769 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f7e3e27-a036-4623-8d63-557a3c0d76e6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987799 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.987835 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.988674 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.989033 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.989422 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.992961 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f7e3e27-a036-4623-8d63-557a3c0d76e6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.993987 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:07 crc kubenswrapper[4753]: I0129 14:19:07.994247 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.003885 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.006092 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.022280 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f7e3e27-a036-4623-8d63-557a3c0d76e6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.032350 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrhjh\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-kube-api-access-qrhjh\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.036020 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.047341 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.102930 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.108441 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" event={"ID":"8ff2aba6-2861-4506-be8d-5da3acf76f6d","Type":"ContainerStarted","Data":"0551a3ac8e0d09962df5d325708aedd57c5dc2dffb20decde3436d3bfcebe953"}
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.138704 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-4t289" event={"ID":"a2e787ea-764f-4911-8ffb-d6c213fded08","Type":"ContainerStarted","Data":"7b19e0caa1f1083f656f9a48fac914e3f963e0b7e648a492b565a8b609011733"}
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.614673 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 14:19:08 crc kubenswrapper[4753]: W0129 14:19:08.629546 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad5c04aa_ed92_4c33_ad37_4420b362e237.slice/crio-e04a382dc583dcc5b5c0702ebe67f18b09aff1f243ceb257a57d7d0071dc19ec WatchSource:0}: Error finding container e04a382dc583dcc5b5c0702ebe67f18b09aff1f243ceb257a57d7d0071dc19ec: Status 404 returned error can't find the container with id e04a382dc583dcc5b5c0702ebe67f18b09aff1f243ceb257a57d7d0071dc19ec
Jan 29 14:19:08 crc kubenswrapper[4753]: I0129 14:19:08.740170 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.081894 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.085687 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.088694 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.089341 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-m9jsg"
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.091538 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.093676 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.097834 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.101733 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.158852 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f7e3e27-a036-4623-8d63-557a3c0d76e6","Type":"ContainerStarted","Data":"c44106bb8710cc08a418ad82d9da2d097116bce22fb7ae11db8985c45ecf9081"}
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.162743 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad5c04aa-ed92-4c33-ad37-4420b362e237","Type":"ContainerStarted","Data":"e04a382dc583dcc5b5c0702ebe67f18b09aff1f243ceb257a57d7d0071dc19ec"}
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.226860 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0"
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.226905 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0"
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.226926 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0"
Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.226950 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jnmh\" (UniqueName: \"kubernetes.io/projected/94157b6b-3cc9-44e9-9625-64d34611046a-kube-api-access-2jnmh\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0"
\"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.227121 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.227319 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-default\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.227492 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-kolla-config\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.329455 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.329494 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.329525 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.329546 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jnmh\" (UniqueName: \"kubernetes.io/projected/94157b6b-3cc9-44e9-9625-64d34611046a-kube-api-access-2jnmh\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.329590 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.329624 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.329679 
4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-default\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.329698 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-kolla-config\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.330460 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-kolla-config\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.331491 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.332021 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.332422 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-default\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.332647 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.339715 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.340129 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.349283 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jnmh\" (UniqueName: \"kubernetes.io/projected/94157b6b-3cc9-44e9-9625-64d34611046a-kube-api-access-2jnmh\") pod \"openstack-galera-0\" 
(UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.354916 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") " pod="openstack/openstack-galera-0" Jan 29 14:19:09 crc kubenswrapper[4753]: I0129 14:19:09.482337 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.617419 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.645470 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.673892 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.679443 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.679717 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.682618 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nqtnk" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.719667 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.727522 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.734725 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.742504 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5cl64" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.745450 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.781130 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.811005 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.814280 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrrrr\" (UniqueName: \"kubernetes.io/projected/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kube-api-access-nrrrr\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.814310 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-config-data\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.814376 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kolla-config\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.814443 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.893855 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.916957 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917032 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917071 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917126 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917146 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrrrr\" (UniqueName: \"kubernetes.io/projected/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kube-api-access-nrrrr\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917185 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-config-data\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917213 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917276 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917299 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kolla-config\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917368 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917402 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917428 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.917448 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tf7g\" (UniqueName: \"kubernetes.io/projected/001ea12a-a725-4cd9-a12e-1442d56f7068-kube-api-access-5tf7g\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.919582 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-config-data\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.920197 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kolla-config\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.948625 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.948704 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:10 crc kubenswrapper[4753]: I0129 14:19:10.967884 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrrrr\" (UniqueName: \"kubernetes.io/projected/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kube-api-access-nrrrr\") pod \"memcached-0\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " pod="openstack/memcached-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.023037 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.023109 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.023141 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: 
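Taken together, the reconciler and operation-generator messages trace a fixed per-volume progression: VerifyControllerAttachedVolume started, MountVolume started, MountVolume.MountDevice succeeded (only for device-backed volumes such as the local-storage PVs), then MountVolume.SetUp succeeded. A Go sketch encoding that ordering for log-driven tracking; the enum is illustrative, not a kubelet type.

package main

import "fmt"

// mountStage mirrors the messages seen in this log; illustrative only.
type mountStage int

const (
	stageVerifyAttached mountStage = iota // reconciler_common.go: "VerifyControllerAttachedVolume started"
	stageMountStarted                     // reconciler_common.go: "MountVolume started"
	stageDeviceMounted                    // operation_generator.go: "MountVolume.MountDevice succeeded"
	stageSetUpDone                        // operation_generator.go: "MountVolume.SetUp succeeded"
)

func (s mountStage) String() string {
	return [...]string{"VerifyAttached", "MountStarted", "DeviceMounted", "SetUpDone"}[s]
}

// advance enforces the ordering observed above. MountDevice appears only for
// volumes with a device to stage, so SetUp may follow MountStarted directly
// for configmap/secret/projected/empty-dir volumes.
func advance(cur, next mountStage) (mountStage, error) {
	if next <= cur {
		return cur, fmt.Errorf("stage %v after %v out of order", next, cur)
	}
	return next, nil
}

func main() {
	s := stageVerifyAttached
	for _, n := range []mountStage{stageMountStarted, stageSetUpDone} {
		var err error
		if s, err = advance(s, n); err != nil {
			fmt.Println(err)
			return
		}
		fmt.Println("->", s)
	}
}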
\"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.023188 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.023233 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.023283 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.023309 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.023332 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tf7g\" (UniqueName: \"kubernetes.io/projected/001ea12a-a725-4cd9-a12e-1442d56f7068-kube-api-access-5tf7g\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.024548 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.024701 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.025518 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.027961 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 
29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.030050 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.030513 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.045309 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.045536 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.046220 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tf7g\" (UniqueName: \"kubernetes.io/projected/001ea12a-a725-4cd9-a12e-1442d56f7068-kube-api-access-5tf7g\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.054879 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:11 crc kubenswrapper[4753]: I0129 14:19:11.160887 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 29 14:19:12 crc kubenswrapper[4753]: I0129 14:19:12.385430 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:19:12 crc kubenswrapper[4753]: I0129 14:19:12.391790 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 14:19:12 crc kubenswrapper[4753]: I0129 14:19:12.395543 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-qtxkg" Jan 29 14:19:12 crc kubenswrapper[4753]: I0129 14:19:12.400594 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:19:12 crc kubenswrapper[4753]: I0129 14:19:12.453776 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzpqc\" (UniqueName: \"kubernetes.io/projected/9045fb02-a12f-47c9-afe8-80f12d1599c2-kube-api-access-zzpqc\") pod \"kube-state-metrics-0\" (UID: \"9045fb02-a12f-47c9-afe8-80f12d1599c2\") " pod="openstack/kube-state-metrics-0" Jan 29 14:19:12 crc kubenswrapper[4753]: I0129 14:19:12.557002 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzpqc\" (UniqueName: \"kubernetes.io/projected/9045fb02-a12f-47c9-afe8-80f12d1599c2-kube-api-access-zzpqc\") pod \"kube-state-metrics-0\" (UID: \"9045fb02-a12f-47c9-afe8-80f12d1599c2\") " pod="openstack/kube-state-metrics-0" Jan 29 14:19:12 crc kubenswrapper[4753]: I0129 14:19:12.578127 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzpqc\" (UniqueName: \"kubernetes.io/projected/9045fb02-a12f-47c9-afe8-80f12d1599c2-kube-api-access-zzpqc\") pod \"kube-state-metrics-0\" (UID: \"9045fb02-a12f-47c9-afe8-80f12d1599c2\") " pod="openstack/kube-state-metrics-0" Jan 29 14:19:12 crc kubenswrapper[4753]: I0129 14:19:12.712275 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.813655 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rm9d5"] Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.814791 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.817545 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.818264 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-s5pdr" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.818977 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.828575 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rm9d5"] Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.842022 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-98h7m"] Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.844519 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.863072 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-98h7m"] Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952312 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-ovn-controller-tls-certs\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952373 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmrcd\" (UniqueName: \"kubernetes.io/projected/a17eeeff-955e-4718-9e0e-15fae4b8d9db-kube-api-access-fmrcd\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952406 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-combined-ca-bundle\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952453 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run-ovn\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952478 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqb2l\" (UniqueName: \"kubernetes.io/projected/5ca7a69c-2f29-46d8-ab2a-67393114629f-kube-api-access-mqb2l\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952498 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-log-ovn\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952523 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-lib\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952538 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a17eeeff-955e-4718-9e0e-15fae4b8d9db-scripts\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952745 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-log\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952827 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-etc-ovs\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952922 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.952959 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ca7a69c-2f29-46d8-ab2a-67393114629f-scripts\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:16 crc kubenswrapper[4753]: I0129 14:19:16.953040 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-run\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055593 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-combined-ca-bundle\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055688 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run-ovn\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055732 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqb2l\" (UniqueName: \"kubernetes.io/projected/5ca7a69c-2f29-46d8-ab2a-67393114629f-kube-api-access-mqb2l\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055771 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-log-ovn\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055807 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-lib\") pod \"ovn-controller-ovs-98h7m\" (UID: 
\"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055831 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a17eeeff-955e-4718-9e0e-15fae4b8d9db-scripts\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055866 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-log\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055892 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-etc-ovs\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055924 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055948 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ca7a69c-2f29-46d8-ab2a-67393114629f-scripts\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.055981 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-run\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.056024 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-ovn-controller-tls-certs\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.056058 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmrcd\" (UniqueName: \"kubernetes.io/projected/a17eeeff-955e-4718-9e0e-15fae4b8d9db-kube-api-access-fmrcd\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.056304 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run-ovn\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.056363 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-etc-ovs\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.056513 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-log\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.056683 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-run\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.056723 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-log-ovn\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.056748 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.057091 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-lib\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.058430 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ca7a69c-2f29-46d8-ab2a-67393114629f-scripts\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.061271 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a17eeeff-955e-4718-9e0e-15fae4b8d9db-scripts\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.075333 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-ovn-controller-tls-certs\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.082659 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-combined-ca-bundle\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.087363 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqb2l\" (UniqueName: \"kubernetes.io/projected/5ca7a69c-2f29-46d8-ab2a-67393114629f-kube-api-access-mqb2l\") pod \"ovn-controller-rm9d5\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") " pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.116530 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmrcd\" (UniqueName: \"kubernetes.io/projected/a17eeeff-955e-4718-9e0e-15fae4b8d9db-kube-api-access-fmrcd\") pod \"ovn-controller-ovs-98h7m\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.127373 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.137411 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.139906 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.144043 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.144361 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.145597 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.145906 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2xfqb" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.147844 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.150416 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.182643 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.259382 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-config\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.259435 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.259466 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.259488 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.259505 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.259548 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.259724 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.259830 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdg55\" (UniqueName: \"kubernetes.io/projected/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-kube-api-access-kdg55\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.361130 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.361204 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.361232 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.361247 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.361279 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.361327 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.361360 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdg55\" (UniqueName: \"kubernetes.io/projected/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-kube-api-access-kdg55\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.361392 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-config\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.361620 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.362275 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-config\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.363010 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " 
pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.364003 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.367267 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.367535 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.386501 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.389883 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.391386 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdg55\" (UniqueName: \"kubernetes.io/projected/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-kube-api-access-kdg55\") pod \"ovsdbserver-nb-0\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:17 crc kubenswrapper[4753]: I0129 14:19:17.467693 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.724887 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.726987 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.731561 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-t5grj" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.731712 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.731722 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.731779 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.745130 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.811936 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.812476 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6sc5\" (UniqueName: \"kubernetes.io/projected/08759bfe-4b2e-4da9-b0b0-2149a71a831e-kube-api-access-x6sc5\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.812597 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.812632 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-config\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.812707 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.812963 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.813196 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.813294 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.915263 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.915355 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6sc5\" (UniqueName: \"kubernetes.io/projected/08759bfe-4b2e-4da9-b0b0-2149a71a831e-kube-api-access-x6sc5\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.915383 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.915406 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-config\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.915485 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.915550 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.915600 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.915631 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.916924 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.917563 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.918356 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-config\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.919110 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.927245 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.928241 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.937713 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6sc5\" (UniqueName: \"kubernetes.io/projected/08759bfe-4b2e-4da9-b0b0-2149a71a831e-kube-api-access-x6sc5\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.937970 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:19 crc kubenswrapper[4753]: I0129 14:19:19.948218 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:20 crc kubenswrapper[4753]: I0129 14:19:20.052500 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:29 crc kubenswrapper[4753]: E0129 14:19:29.771101 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d" Jan 29 14:19:29 crc kubenswrapper[4753]: E0129 14:19:29.772498 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cl52d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(ad5c04aa-ed92-4c33-ad37-4420b362e237): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:19:29 crc kubenswrapper[4753]: E0129 14:19:29.773605 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="ad5c04aa-ed92-4c33-ad37-4420b362e237" Jan 29 14:19:30 crc kubenswrapper[4753]: I0129 14:19:30.475845 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 14:19:30 crc kubenswrapper[4753]: W0129 14:19:30.915224 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd12b7e9_4fd2_4ae9_8e74_d8499726d995.slice/crio-67fde236cb8f75ff617756354524ff65f2bab7137de4fa8807321371899c1ccf WatchSource:0}: Error finding container 67fde236cb8f75ff617756354524ff65f2bab7137de4fa8807321371899c1ccf: Status 404 returned error can't find the container with id 67fde236cb8f75ff617756354524ff65f2bab7137de4fa8807321371899c1ccf Jan 29 14:19:30 crc kubenswrapper[4753]: E0129 14:19:30.977430 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 29 14:19:30 crc kubenswrapper[4753]: E0129 14:19:30.978238 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c7q96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-5f854695bc-qj4v7_openstack(e170336b-8ce2-419e-b9e7-59d73f08f294): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:19:30 crc kubenswrapper[4753]: E0129 14:19:30.978586 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 29 14:19:30 crc kubenswrapper[4753]: E0129 14:19:30.978715 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jz87v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-744ffd65bc-hnxhk_openstack(8ff2aba6-2861-4506-be8d-5da3acf76f6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:19:30 crc kubenswrapper[4753]: E0129 14:19:30.980301 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" podUID="8ff2aba6-2861-4506-be8d-5da3acf76f6d" Jan 29 14:19:30 crc kubenswrapper[4753]: E0129 14:19:30.980362 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = 
copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" podUID="e170336b-8ce2-419e-b9e7-59d73f08f294" Jan 29 14:19:31 crc kubenswrapper[4753]: E0129 14:19:31.011848 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 29 14:19:31 crc kubenswrapper[4753]: E0129 14:19:31.014136 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wtr9r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-95f5f6995-4t289_openstack(a2e787ea-764f-4911-8ffb-d6c213fded08): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:19:31 crc kubenswrapper[4753]: E0129 14:19:31.016620 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-95f5f6995-4t289" podUID="a2e787ea-764f-4911-8ffb-d6c213fded08" Jan 29 14:19:31 crc kubenswrapper[4753]: E0129 14:19:31.046295 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 29 14:19:31 crc kubenswrapper[4753]: E0129 14:19:31.046539 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxvls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-89t5g_openstack(c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:19:31 crc kubenswrapper[4753]: E0129 14:19:31.048502 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" podUID="c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a" Jan 29 14:19:31 crc kubenswrapper[4753]: I0129 14:19:31.469421 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dd12b7e9-4fd2-4ae9-8e74-d8499726d995","Type":"ContainerStarted","Data":"67fde236cb8f75ff617756354524ff65f2bab7137de4fa8807321371899c1ccf"} Jan 29 14:19:31 crc kubenswrapper[4753]: E0129 14:19:31.471979 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-95f5f6995-4t289" podUID="a2e787ea-764f-4911-8ffb-d6c213fded08" 
Jan 29 14:19:31 crc kubenswrapper[4753]: E0129 14:19:31.472506 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" podUID="8ff2aba6-2861-4506-be8d-5da3acf76f6d" Jan 29 14:19:31 crc kubenswrapper[4753]: W0129 14:19:31.595928 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff1df95b_0bf3_47ef_a25f_e5e8d7181ecb.slice/crio-82068b6aff78f1d96c9577ba7dcca553c78e7c01638829e64f5c13ad5a34ca4d WatchSource:0}: Error finding container 82068b6aff78f1d96c9577ba7dcca553c78e7c01638829e64f5c13ad5a34ca4d: Status 404 returned error can't find the container with id 82068b6aff78f1d96c9577ba7dcca553c78e7c01638829e64f5c13ad5a34ca4d Jan 29 14:19:31 crc kubenswrapper[4753]: I0129 14:19:31.614952 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 29 14:19:31 crc kubenswrapper[4753]: W0129 14:19:31.615401 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9045fb02_a12f_47c9_afe8_80f12d1599c2.slice/crio-2b69f0188b65b1f471344137471b113f829a52450704527065301b617bac75d7 WatchSource:0}: Error finding container 2b69f0188b65b1f471344137471b113f829a52450704527065301b617bac75d7: Status 404 returned error can't find the container with id 2b69f0188b65b1f471344137471b113f829a52450704527065301b617bac75d7 Jan 29 14:19:31 crc kubenswrapper[4753]: I0129 14:19:31.626520 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rm9d5"] Jan 29 14:19:31 crc kubenswrapper[4753]: I0129 14:19:31.643754 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:19:31 crc kubenswrapper[4753]: I0129 14:19:31.660268 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 29 14:19:31 crc kubenswrapper[4753]: I0129 14:19:31.715222 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 14:19:31 crc kubenswrapper[4753]: I0129 14:19:31.725511 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 14:19:31 crc kubenswrapper[4753]: I0129 14:19:31.868395 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:31 crc kubenswrapper[4753]: I0129 14:19:31.965369 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.001865 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7q96\" (UniqueName: \"kubernetes.io/projected/e170336b-8ce2-419e-b9e7-59d73f08f294-kube-api-access-c7q96\") pod \"e170336b-8ce2-419e-b9e7-59d73f08f294\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.001938 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-dns-svc\") pod \"e170336b-8ce2-419e-b9e7-59d73f08f294\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.002043 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-config\") pod \"e170336b-8ce2-419e-b9e7-59d73f08f294\" (UID: \"e170336b-8ce2-419e-b9e7-59d73f08f294\") " Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.002852 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-config" (OuterVolumeSpecName: "config") pod "e170336b-8ce2-419e-b9e7-59d73f08f294" (UID: "e170336b-8ce2-419e-b9e7-59d73f08f294"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.003333 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e170336b-8ce2-419e-b9e7-59d73f08f294" (UID: "e170336b-8ce2-419e-b9e7-59d73f08f294"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.103893 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-config\") pod \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\" (UID: \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\") " Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.104367 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-config" (OuterVolumeSpecName: "config") pod "c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a" (UID: "c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.104420 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxvls\" (UniqueName: \"kubernetes.io/projected/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-kube-api-access-cxvls\") pod \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\" (UID: \"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a\") " Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.104739 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.104754 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.104764 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e170336b-8ce2-419e-b9e7-59d73f08f294-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.108498 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-kube-api-access-cxvls" (OuterVolumeSpecName: "kube-api-access-cxvls") pod "c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a" (UID: "c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a"). InnerVolumeSpecName "kube-api-access-cxvls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.193282 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e170336b-8ce2-419e-b9e7-59d73f08f294-kube-api-access-c7q96" (OuterVolumeSpecName: "kube-api-access-c7q96") pod "e170336b-8ce2-419e-b9e7-59d73f08f294" (UID: "e170336b-8ce2-419e-b9e7-59d73f08f294"). InnerVolumeSpecName "kube-api-access-c7q96". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.207629 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxvls\" (UniqueName: \"kubernetes.io/projected/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a-kube-api-access-cxvls\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.207689 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7q96\" (UniqueName: \"kubernetes.io/projected/e170336b-8ce2-419e-b9e7-59d73f08f294-kube-api-access-c7q96\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.390930 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-98h7m"] Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.478357 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.478393 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-89t5g" event={"ID":"c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a","Type":"ContainerDied","Data":"f23b92a8bd82d658a07a5a82552e96a880650bd96cf09158bb7b538c32b9fc84"} Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.480856 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad5c04aa-ed92-4c33-ad37-4420b362e237","Type":"ContainerStarted","Data":"6ea11bc6de1dca2ccb62590a48ec34192f4584015fe67a14444f1dcae0fb9d8d"} Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.482454 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94157b6b-3cc9-44e9-9625-64d34611046a","Type":"ContainerStarted","Data":"5fbe6adbe68b71996173814eff92879f61d37767a107508b1ea3b41021023941"} Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.484416 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb","Type":"ContainerStarted","Data":"82068b6aff78f1d96c9577ba7dcca553c78e7c01638829e64f5c13ad5a34ca4d"} Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.485510 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rm9d5" event={"ID":"5ca7a69c-2f29-46d8-ab2a-67393114629f","Type":"ContainerStarted","Data":"50e506e8e9f1b22f13930607560c49f3d08c0311073e60c9d11c1906427e185e"} Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.487426 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"001ea12a-a725-4cd9-a12e-1442d56f7068","Type":"ContainerStarted","Data":"b73bc2dcdd1e10d160fcb2dd67ce0c404d53f92df5fb95335e5b3080b1552073"} Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.488435 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9045fb02-a12f-47c9-afe8-80f12d1599c2","Type":"ContainerStarted","Data":"2b69f0188b65b1f471344137471b113f829a52450704527065301b617bac75d7"} Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.489861 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" event={"ID":"e170336b-8ce2-419e-b9e7-59d73f08f294","Type":"ContainerDied","Data":"bd714745be7913b0066854c7c9582fbc5d2fa07d7086c15255c9fc7f6923c169"} Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.489904 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-qj4v7" Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.492205 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"08759bfe-4b2e-4da9-b0b0-2149a71a831e","Type":"ContainerStarted","Data":"be1e3ce1ccf7a3365d57f5121e4998099ecf88dd17f49e0f0e325e53962e3e79"} Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.608566 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-89t5g"] Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.676359 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-89t5g"] Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.761217 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-qj4v7"] Jan 29 14:19:32 crc kubenswrapper[4753]: I0129 14:19:32.789051 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-qj4v7"] Jan 29 14:19:33 crc kubenswrapper[4753]: I0129 14:19:33.502810 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-98h7m" event={"ID":"a17eeeff-955e-4718-9e0e-15fae4b8d9db","Type":"ContainerStarted","Data":"92d9bdc0d276e91faa557fcc71325bda4d2fef360a9a067f152c8787ad06822e"} Jan 29 14:19:33 crc kubenswrapper[4753]: I0129 14:19:33.504890 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f7e3e27-a036-4623-8d63-557a3c0d76e6","Type":"ContainerStarted","Data":"81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd"} Jan 29 14:19:34 crc kubenswrapper[4753]: I0129 14:19:34.160852 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a" path="/var/lib/kubelet/pods/c6d0333f-fb0d-43cb-b418-0e2d75f0ce1a/volumes" Jan 29 14:19:34 crc kubenswrapper[4753]: I0129 14:19:34.161587 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e170336b-8ce2-419e-b9e7-59d73f08f294" path="/var/lib/kubelet/pods/e170336b-8ce2-419e-b9e7-59d73f08f294/volumes" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.215282 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rdpjg"] Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.217365 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.221915 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.250059 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rdpjg"] Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.333253 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-combined-ca-bundle\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.333460 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cnrt\" (UniqueName: \"kubernetes.io/projected/1e919242-0fa9-4c26-90c5-718fec0a9109-kube-api-access-9cnrt\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.333481 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.333529 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovn-rundir\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.333564 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovs-rundir\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.333593 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e919242-0fa9-4c26-90c5-718fec0a9109-config\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.422745 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-hnxhk"] Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.435869 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovn-rundir\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.435919 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovs-rundir\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.435952 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e919242-0fa9-4c26-90c5-718fec0a9109-config\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.436015 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-combined-ca-bundle\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.436032 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cnrt\" (UniqueName: \"kubernetes.io/projected/1e919242-0fa9-4c26-90c5-718fec0a9109-kube-api-access-9cnrt\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.436047 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.438014 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e919242-0fa9-4c26-90c5-718fec0a9109-config\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.438286 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovn-rundir\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.438339 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovs-rundir\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.469718 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-combined-ca-bundle\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.478885 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.502569 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-4ccz9"] Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.514814 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.521283 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cnrt\" (UniqueName: \"kubernetes.io/projected/1e919242-0fa9-4c26-90c5-718fec0a9109-kube-api-access-9cnrt\") pod \"ovn-controller-metrics-rdpjg\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.521523 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.539685 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-4ccz9"] Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.558892 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.645291 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.645558 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-config\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.645593 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nvrd\" (UniqueName: \"kubernetes.io/projected/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-kube-api-access-7nvrd\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.645630 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-dns-svc\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.701686 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-4t289"] Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.753889 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-dns-svc\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: 
\"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.754222 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.754282 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-config\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.754313 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nvrd\" (UniqueName: \"kubernetes.io/projected/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-kube-api-access-7nvrd\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.756903 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-dns-svc\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.758953 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.759365 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-config\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.777635 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9lql4"] Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.779063 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.779490 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nvrd\" (UniqueName: \"kubernetes.io/projected/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-kube-api-access-7nvrd\") pod \"dnsmasq-dns-5b79764b65-4ccz9\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.786178 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.809087 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9lql4"] Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.855708 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.855743 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-dns-svc\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.855761 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-config\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.855810 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb8w4\" (UniqueName: \"kubernetes.io/projected/45a11ce9-e11f-4d7a-927c-142a0e302298-kube-api-access-xb8w4\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.855842 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.918935 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.956956 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb8w4\" (UniqueName: \"kubernetes.io/projected/45a11ce9-e11f-4d7a-927c-142a0e302298-kube-api-access-xb8w4\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.957021 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.957120 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.957144 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-dns-svc\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.957174 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-config\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.958094 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-config\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.959108 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-sb\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.959292 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-nb\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.959671 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-dns-svc\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 
14:19:40.979301 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb8w4\" (UniqueName: \"kubernetes.io/projected/45a11ce9-e11f-4d7a-927c-142a0e302298-kube-api-access-xb8w4\") pod \"dnsmasq-dns-586b989cdc-9lql4\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:40 crc kubenswrapper[4753]: I0129 14:19:40.982285 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.058112 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz87v\" (UniqueName: \"kubernetes.io/projected/8ff2aba6-2861-4506-be8d-5da3acf76f6d-kube-api-access-jz87v\") pod \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.058637 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-dns-svc\") pod \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.058690 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-config\") pod \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\" (UID: \"8ff2aba6-2861-4506-be8d-5da3acf76f6d\") " Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.061607 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-config" (OuterVolumeSpecName: "config") pod "8ff2aba6-2861-4506-be8d-5da3acf76f6d" (UID: "8ff2aba6-2861-4506-be8d-5da3acf76f6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.062001 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8ff2aba6-2861-4506-be8d-5da3acf76f6d" (UID: "8ff2aba6-2861-4506-be8d-5da3acf76f6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.063434 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ff2aba6-2861-4506-be8d-5da3acf76f6d-kube-api-access-jz87v" (OuterVolumeSpecName: "kube-api-access-jz87v") pod "8ff2aba6-2861-4506-be8d-5da3acf76f6d" (UID: "8ff2aba6-2861-4506-be8d-5da3acf76f6d"). InnerVolumeSpecName "kube-api-access-jz87v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.132465 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.165360 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz87v\" (UniqueName: \"kubernetes.io/projected/8ff2aba6-2861-4506-be8d-5da3acf76f6d-kube-api-access-jz87v\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.165387 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.165397 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff2aba6-2861-4506-be8d-5da3acf76f6d-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.223669 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.266090 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtr9r\" (UniqueName: \"kubernetes.io/projected/a2e787ea-764f-4911-8ffb-d6c213fded08-kube-api-access-wtr9r\") pod \"a2e787ea-764f-4911-8ffb-d6c213fded08\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.266240 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-config\") pod \"a2e787ea-764f-4911-8ffb-d6c213fded08\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.266318 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-dns-svc\") pod \"a2e787ea-764f-4911-8ffb-d6c213fded08\" (UID: \"a2e787ea-764f-4911-8ffb-d6c213fded08\") " Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.266905 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-config" (OuterVolumeSpecName: "config") pod "a2e787ea-764f-4911-8ffb-d6c213fded08" (UID: "a2e787ea-764f-4911-8ffb-d6c213fded08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.267111 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2e787ea-764f-4911-8ffb-d6c213fded08" (UID: "a2e787ea-764f-4911-8ffb-d6c213fded08"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.273477 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2e787ea-764f-4911-8ffb-d6c213fded08-kube-api-access-wtr9r" (OuterVolumeSpecName: "kube-api-access-wtr9r") pod "a2e787ea-764f-4911-8ffb-d6c213fded08" (UID: "a2e787ea-764f-4911-8ffb-d6c213fded08"). InnerVolumeSpecName "kube-api-access-wtr9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.318751 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rdpjg"] Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.371455 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtr9r\" (UniqueName: \"kubernetes.io/projected/a2e787ea-764f-4911-8ffb-d6c213fded08-kube-api-access-wtr9r\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.371490 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.371500 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e787ea-764f-4911-8ffb-d6c213fded08-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.504846 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-4ccz9"] Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.650092 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rdpjg" event={"ID":"1e919242-0fa9-4c26-90c5-718fec0a9109","Type":"ContainerStarted","Data":"c4195b05dd8b9b921e979b735020e1af4d9377731db0ebb4181aff5c714cd9db"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.651528 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" event={"ID":"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c","Type":"ContainerStarted","Data":"74d4fb495226086a92d681c6ba3cfcae6b05b09857ee144140151ec8ed7e1165"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.653070 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-98h7m" event={"ID":"a17eeeff-955e-4718-9e0e-15fae4b8d9db","Type":"ContainerStarted","Data":"554ab450b98c8af45b7d97da5f184a9f2c588b955ad8fc71c36ce3fca25f5e71"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.654694 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dd12b7e9-4fd2-4ae9-8e74-d8499726d995","Type":"ContainerStarted","Data":"1b90c85fe79f6d643048d2a3c0d72a78c8d8d608ab45d4967e59e9a9eb61c33c"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.657523 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"08759bfe-4b2e-4da9-b0b0-2149a71a831e","Type":"ContainerStarted","Data":"0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.660218 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.660220 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-hnxhk" event={"ID":"8ff2aba6-2861-4506-be8d-5da3acf76f6d","Type":"ContainerDied","Data":"0551a3ac8e0d09962df5d325708aedd57c5dc2dffb20decde3436d3bfcebe953"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.666014 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"001ea12a-a725-4cd9-a12e-1442d56f7068","Type":"ContainerStarted","Data":"c48f2caaa7b2b2dcc7a3ca761e1a8901c8e8087e528a5186cf1bb1488756685e"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.671748 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94157b6b-3cc9-44e9-9625-64d34611046a","Type":"ContainerStarted","Data":"0f883b984a6efe1a3b4819a8200d14cf585a8a3d8843d988a84fda1b04aaa30e"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.676184 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb","Type":"ContainerStarted","Data":"5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.676558 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.678398 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-4t289" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.678427 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-4t289" event={"ID":"a2e787ea-764f-4911-8ffb-d6c213fded08","Type":"ContainerDied","Data":"7b19e0caa1f1083f656f9a48fac914e3f963e0b7e648a492b565a8b609011733"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.680667 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rm9d5" event={"ID":"5ca7a69c-2f29-46d8-ab2a-67393114629f","Type":"ContainerStarted","Data":"a6dfbee60ef1dbfc42673695d5f8844faa43eef9fa5e99e94df7f7bc0fbae6ec"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.681079 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rm9d5" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.683801 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9045fb02-a12f-47c9-afe8-80f12d1599c2","Type":"ContainerStarted","Data":"961176ee521f54c0ddc517579ff16849595beaa66b484e3eb7f1bd0bcea7b8e3"} Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.684108 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.754179 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.738124541 podStartE2EDuration="31.75414421s" podCreationTimestamp="2026-01-29 14:19:10 +0000 UTC" firstStartedPulling="2026-01-29 14:19:31.601919488 +0000 UTC m=+1006.296653870" lastFinishedPulling="2026-01-29 14:19:39.617939137 +0000 UTC m=+1014.312673539" observedRunningTime="2026-01-29 14:19:41.744734757 +0000 UTC m=+1016.439469149" watchObservedRunningTime="2026-01-29 14:19:41.75414421 +0000 UTC m=+1016.448878592" Jan 29 14:19:41 crc 
kubenswrapper[4753]: I0129 14:19:41.767277 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9lql4"] Jan 29 14:19:41 crc kubenswrapper[4753]: W0129 14:19:41.773414 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45a11ce9_e11f_4d7a_927c_142a0e302298.slice/crio-216842678ceedda14e94b524bfe84892c6c86962545d778fcb8b6228e6629ef0 WatchSource:0}: Error finding container 216842678ceedda14e94b524bfe84892c6c86962545d778fcb8b6228e6629ef0: Status 404 returned error can't find the container with id 216842678ceedda14e94b524bfe84892c6c86962545d778fcb8b6228e6629ef0 Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.774353 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rm9d5" podStartSLOduration=17.264620597 podStartE2EDuration="25.774337206s" podCreationTimestamp="2026-01-29 14:19:16 +0000 UTC" firstStartedPulling="2026-01-29 14:19:31.648127286 +0000 UTC m=+1006.342861668" lastFinishedPulling="2026-01-29 14:19:40.157843855 +0000 UTC m=+1014.852578277" observedRunningTime="2026-01-29 14:19:41.767764928 +0000 UTC m=+1016.462499330" watchObservedRunningTime="2026-01-29 14:19:41.774337206 +0000 UTC m=+1016.469071598" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.799182 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=21.050240298 podStartE2EDuration="29.799146275s" podCreationTimestamp="2026-01-29 14:19:12 +0000 UTC" firstStartedPulling="2026-01-29 14:19:31.62601135 +0000 UTC m=+1006.320745732" lastFinishedPulling="2026-01-29 14:19:40.374917327 +0000 UTC m=+1015.069651709" observedRunningTime="2026-01-29 14:19:41.786654259 +0000 UTC m=+1016.481388641" watchObservedRunningTime="2026-01-29 14:19:41.799146275 +0000 UTC m=+1016.493880657" Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.843012 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-4t289"] Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.857682 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-4t289"] Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.887620 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-hnxhk"] Jan 29 14:19:41 crc kubenswrapper[4753]: I0129 14:19:41.894645 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-hnxhk"] Jan 29 14:19:42 crc kubenswrapper[4753]: I0129 14:19:42.160575 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ff2aba6-2861-4506-be8d-5da3acf76f6d" path="/var/lib/kubelet/pods/8ff2aba6-2861-4506-be8d-5da3acf76f6d/volumes" Jan 29 14:19:42 crc kubenswrapper[4753]: I0129 14:19:42.161348 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2e787ea-764f-4911-8ffb-d6c213fded08" path="/var/lib/kubelet/pods/a2e787ea-764f-4911-8ffb-d6c213fded08/volumes" Jan 29 14:19:42 crc kubenswrapper[4753]: I0129 14:19:42.696227 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" event={"ID":"45a11ce9-e11f-4d7a-927c-142a0e302298","Type":"ContainerStarted","Data":"216842678ceedda14e94b524bfe84892c6c86962545d778fcb8b6228e6629ef0"} Jan 29 14:19:43 crc kubenswrapper[4753]: I0129 14:19:43.706021 4753 generic.go:334] "Generic (PLEG): container finished" 
podID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerID="554ab450b98c8af45b7d97da5f184a9f2c588b955ad8fc71c36ce3fca25f5e71" exitCode=0 Jan 29 14:19:43 crc kubenswrapper[4753]: I0129 14:19:43.706068 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-98h7m" event={"ID":"a17eeeff-955e-4718-9e0e-15fae4b8d9db","Type":"ContainerDied","Data":"554ab450b98c8af45b7d97da5f184a9f2c588b955ad8fc71c36ce3fca25f5e71"} Jan 29 14:19:44 crc kubenswrapper[4753]: I0129 14:19:44.716663 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-98h7m" event={"ID":"a17eeeff-955e-4718-9e0e-15fae4b8d9db","Type":"ContainerStarted","Data":"f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742"} Jan 29 14:19:45 crc kubenswrapper[4753]: I0129 14:19:45.732689 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-98h7m" event={"ID":"a17eeeff-955e-4718-9e0e-15fae4b8d9db","Type":"ContainerStarted","Data":"49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0"} Jan 29 14:19:45 crc kubenswrapper[4753]: I0129 14:19:45.733308 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:45 crc kubenswrapper[4753]: I0129 14:19:45.733378 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:19:46 crc kubenswrapper[4753]: I0129 14:19:46.032349 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 29 14:19:46 crc kubenswrapper[4753]: I0129 14:19:46.086235 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-98h7m" podStartSLOduration=22.439554125 podStartE2EDuration="30.086201949s" podCreationTimestamp="2026-01-29 14:19:16 +0000 UTC" firstStartedPulling="2026-01-29 14:19:32.51112551 +0000 UTC m=+1007.205859892" lastFinishedPulling="2026-01-29 14:19:40.157773334 +0000 UTC m=+1014.852507716" observedRunningTime="2026-01-29 14:19:45.761776379 +0000 UTC m=+1020.456510801" watchObservedRunningTime="2026-01-29 14:19:46.086201949 +0000 UTC m=+1020.780936401" Jan 29 14:19:51 crc kubenswrapper[4753]: I0129 14:19:51.931206 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rdpjg" event={"ID":"1e919242-0fa9-4c26-90c5-718fec0a9109","Type":"ContainerStarted","Data":"2e8b4350e06c3e2f7876fe0bc0220340e6c1e15c1569e84e4a38084b04546f2d"} Jan 29 14:19:51 crc kubenswrapper[4753]: I0129 14:19:51.937258 4753 generic.go:334] "Generic (PLEG): container finished" podID="f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" containerID="21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2" exitCode=0 Jan 29 14:19:51 crc kubenswrapper[4753]: I0129 14:19:51.937396 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" event={"ID":"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c","Type":"ContainerDied","Data":"21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2"} Jan 29 14:19:51 crc kubenswrapper[4753]: I0129 14:19:51.943402 4753 generic.go:334] "Generic (PLEG): container finished" podID="45a11ce9-e11f-4d7a-927c-142a0e302298" containerID="094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759" exitCode=0 Jan 29 14:19:51 crc kubenswrapper[4753]: I0129 14:19:51.943514 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" 
event={"ID":"45a11ce9-e11f-4d7a-927c-142a0e302298","Type":"ContainerDied","Data":"094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759"} Jan 29 14:19:51 crc kubenswrapper[4753]: I0129 14:19:51.958974 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dd12b7e9-4fd2-4ae9-8e74-d8499726d995","Type":"ContainerStarted","Data":"8be6e8569c31079d6223e7f119db680fdc968fb974c6cb3d3a9876712caa477d"} Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.020573 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rdpjg" podStartSLOduration=2.6129341200000002 podStartE2EDuration="12.020545414s" podCreationTimestamp="2026-01-29 14:19:40 +0000 UTC" firstStartedPulling="2026-01-29 14:19:41.328774684 +0000 UTC m=+1016.023509066" lastFinishedPulling="2026-01-29 14:19:50.736385978 +0000 UTC m=+1025.431120360" observedRunningTime="2026-01-29 14:19:51.981192301 +0000 UTC m=+1026.675926683" watchObservedRunningTime="2026-01-29 14:19:52.020545414 +0000 UTC m=+1026.715279806" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.023608 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"08759bfe-4b2e-4da9-b0b0-2149a71a831e","Type":"ContainerStarted","Data":"fa247e070c5c59fdd8d782d513689c0f7fef6b0f207b3bdcee7043f734fd870a"} Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.099397 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.253067712 podStartE2EDuration="36.099377502s" podCreationTimestamp="2026-01-29 14:19:16 +0000 UTC" firstStartedPulling="2026-01-29 14:19:30.923105539 +0000 UTC m=+1005.617839931" lastFinishedPulling="2026-01-29 14:19:50.769415339 +0000 UTC m=+1025.464149721" observedRunningTime="2026-01-29 14:19:52.058649163 +0000 UTC m=+1026.753383555" watchObservedRunningTime="2026-01-29 14:19:52.099377502 +0000 UTC m=+1026.794111884" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.208356 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=15.164729999 podStartE2EDuration="34.208333205s" podCreationTimestamp="2026-01-29 14:19:18 +0000 UTC" firstStartedPulling="2026-01-29 14:19:31.732104524 +0000 UTC m=+1006.426838906" lastFinishedPulling="2026-01-29 14:19:50.77570773 +0000 UTC m=+1025.470442112" observedRunningTime="2026-01-29 14:19:52.130500263 +0000 UTC m=+1026.825234655" watchObservedRunningTime="2026-01-29 14:19:52.208333205 +0000 UTC m=+1026.903067587" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.468791 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.650072 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9lql4"] Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.754588 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.774637 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-gzftt"] Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.776363 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.800078 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-gzftt"] Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.943578 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.943654 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.943684 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.943746 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srk8g\" (UniqueName: \"kubernetes.io/projected/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-kube-api-access-srk8g\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:52 crc kubenswrapper[4753]: I0129 14:19:52.943948 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-config\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.034987 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" event={"ID":"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c","Type":"ContainerStarted","Data":"f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e"} Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.035097 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.038285 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" event={"ID":"45a11ce9-e11f-4d7a-927c-142a0e302298","Type":"ContainerStarted","Data":"0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279"} Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.039103 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.045596 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-config\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: 
\"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.045660 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.045708 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.045742 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.045791 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srk8g\" (UniqueName: \"kubernetes.io/projected/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-kube-api-access-srk8g\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.046795 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-config\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.047582 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-nb\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.047817 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-dns-svc\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.047998 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-sb\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.052738 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.067088 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" podStartSLOduration=3.830489589 
podStartE2EDuration="13.067062333s" podCreationTimestamp="2026-01-29 14:19:40 +0000 UTC" firstStartedPulling="2026-01-29 14:19:41.526306999 +0000 UTC m=+1016.221041381" lastFinishedPulling="2026-01-29 14:19:50.762879743 +0000 UTC m=+1025.457614125" observedRunningTime="2026-01-29 14:19:53.055191613 +0000 UTC m=+1027.749925995" watchObservedRunningTime="2026-01-29 14:19:53.067062333 +0000 UTC m=+1027.761796725" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.068686 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srk8g\" (UniqueName: \"kubernetes.io/projected/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-kube-api-access-srk8g\") pod \"dnsmasq-dns-67fdf7998c-gzftt\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") " pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.086625 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" podStartSLOduration=4.092867483 podStartE2EDuration="13.086610341s" podCreationTimestamp="2026-01-29 14:19:40 +0000 UTC" firstStartedPulling="2026-01-29 14:19:41.775682992 +0000 UTC m=+1016.470417364" lastFinishedPulling="2026-01-29 14:19:50.76942585 +0000 UTC m=+1025.464160222" observedRunningTime="2026-01-29 14:19:53.082272014 +0000 UTC m=+1027.777006406" watchObservedRunningTime="2026-01-29 14:19:53.086610341 +0000 UTC m=+1027.781344723" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.102940 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.138792 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.468906 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.514743 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.604573 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-gzftt"] Jan 29 14:19:53 crc kubenswrapper[4753]: W0129 14:19:53.621090 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59b14dcd_09ad_4186_98d6_781ef2a5c3f6.slice/crio-ee5e4f51a17b89e86aa2886792b34ef32ec14f6a239fe22c97870d3c610e5d52 WatchSource:0}: Error finding container ee5e4f51a17b89e86aa2886792b34ef32ec14f6a239fe22c97870d3c610e5d52: Status 404 returned error can't find the container with id ee5e4f51a17b89e86aa2886792b34ef32ec14f6a239fe22c97870d3c610e5d52 Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.863024 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.868888 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.871024 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.871484 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.872395 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-n7bgt" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.872667 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.888781 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.967617 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6d9169-050f-40e0-91ff-80d0afa6ff53-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.969032 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.969235 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-cache\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.969384 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.969525 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vktf4\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-kube-api-access-vktf4\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:53 crc kubenswrapper[4753]: I0129 14:19:53.969664 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-lock\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.047702 4753 generic.go:334] "Generic (PLEG): container finished" podID="001ea12a-a725-4cd9-a12e-1442d56f7068" containerID="c48f2caaa7b2b2dcc7a3ca761e1a8901c8e8087e528a5186cf1bb1488756685e" exitCode=0 Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.047783 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"001ea12a-a725-4cd9-a12e-1442d56f7068","Type":"ContainerDied","Data":"c48f2caaa7b2b2dcc7a3ca761e1a8901c8e8087e528a5186cf1bb1488756685e"} Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.051367 4753 generic.go:334] "Generic (PLEG): container finished" podID="94157b6b-3cc9-44e9-9625-64d34611046a" containerID="0f883b984a6efe1a3b4819a8200d14cf585a8a3d8843d988a84fda1b04aaa30e" exitCode=0 Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.051682 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94157b6b-3cc9-44e9-9625-64d34611046a","Type":"ContainerDied","Data":"0f883b984a6efe1a3b4819a8200d14cf585a8a3d8843d988a84fda1b04aaa30e"} Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.054223 4753 generic.go:334] "Generic (PLEG): container finished" podID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" containerID="6ce01ce4b7948a73a401799af6d07ea4296e432d2c38856a74b0bc9ca6672d2a" exitCode=0 Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.054309 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" event={"ID":"59b14dcd-09ad-4186-98d6-781ef2a5c3f6","Type":"ContainerDied","Data":"6ce01ce4b7948a73a401799af6d07ea4296e432d2c38856a74b0bc9ca6672d2a"} Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.054341 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" event={"ID":"59b14dcd-09ad-4186-98d6-781ef2a5c3f6","Type":"ContainerStarted","Data":"ee5e4f51a17b89e86aa2886792b34ef32ec14f6a239fe22c97870d3c610e5d52"} Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.055042 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" podUID="45a11ce9-e11f-4d7a-927c-142a0e302298" containerName="dnsmasq-dns" containerID="cri-o://0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279" gracePeriod=10 Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.055139 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.070892 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6d9169-050f-40e0-91ff-80d0afa6ff53-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.071397 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.071445 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-cache\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.071480 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc 
kubenswrapper[4753]: I0129 14:19:54.071530 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vktf4\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-kube-api-access-vktf4\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.071576 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-lock\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.072501 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.072549 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-cache\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: E0129 14:19:54.072708 4753 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 14:19:54 crc kubenswrapper[4753]: E0129 14:19:54.072728 4753 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 14:19:54 crc kubenswrapper[4753]: E0129 14:19:54.072774 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift podName:ac6d9169-050f-40e0-91ff-80d0afa6ff53 nodeName:}" failed. No retries permitted until 2026-01-29 14:19:54.572755299 +0000 UTC m=+1029.267489691 (durationBeforeRetry 500ms). 
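
The E0129 records above explain why swift-storage-0 cannot start yet: its etc-swift volume is a projected volume backed by the swift-ring-files ConfigMap, which has not been published at this point, so every MountVolume.SetUp attempt fails and is requeued ("No retries permitted until ..."). A sketch of what such a volume plausibly looks like in the pod spec, written with the k8s.io/api/core/v1 types; the single-ConfigMap source is an assumption inferred from the volume name and the error text, not taken from the actual manifest.

    package main

    import (
            "fmt"

            corev1 "k8s.io/api/core/v1"
    )

    func main() {
            // Assumed shape of swift-storage-0's "etc-swift" volume: a projected
            // volume whose only source is the swift-ring-files ConfigMap. Until
            // that ConfigMap exists, MountVolume.SetUp fails exactly as logged.
            etcSwift := corev1.Volume{
                    Name: "etc-swift",
                    VolumeSource: corev1.VolumeSource{
                            Projected: &corev1.ProjectedVolumeSource{
                                    Sources: []corev1.VolumeProjection{{
                                            ConfigMap: &corev1.ConfigMapProjection{
                                                    LocalObjectReference: corev1.LocalObjectReference{
                                                            Name: "swift-ring-files",
                                                    },
                                            },
                                    }},
                            },
                    },
            }
            fmt.Printf("%+v\n", etcSwift)
    }

Once the ConfigMap appears, the same retry loop that is failing here succeeds and the pod proceeds without a restart.
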
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift") pod "swift-storage-0" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53") : configmap "swift-ring-files" not found Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.073402 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-lock\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.081954 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6d9169-050f-40e0-91ff-80d0afa6ff53-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.108919 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vktf4\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-kube-api-access-vktf4\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.119579 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.124038 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.124102 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.516367 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.579247 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-sb\") pod \"45a11ce9-e11f-4d7a-927c-142a0e302298\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.579326 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb8w4\" (UniqueName: \"kubernetes.io/projected/45a11ce9-e11f-4d7a-927c-142a0e302298-kube-api-access-xb8w4\") pod \"45a11ce9-e11f-4d7a-927c-142a0e302298\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.579349 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-dns-svc\") pod \"45a11ce9-e11f-4d7a-927c-142a0e302298\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.579366 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-config\") pod \"45a11ce9-e11f-4d7a-927c-142a0e302298\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.579400 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-nb\") pod \"45a11ce9-e11f-4d7a-927c-142a0e302298\" (UID: \"45a11ce9-e11f-4d7a-927c-142a0e302298\") " Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.579578 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:54 crc kubenswrapper[4753]: E0129 14:19:54.579728 4753 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 14:19:54 crc kubenswrapper[4753]: E0129 14:19:54.579742 4753 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 14:19:54 crc kubenswrapper[4753]: E0129 14:19:54.579781 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift podName:ac6d9169-050f-40e0-91ff-80d0afa6ff53 nodeName:}" failed. No retries permitted until 2026-01-29 14:19:55.579768451 +0000 UTC m=+1030.274502833 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift") pod "swift-storage-0" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53") : configmap "swift-ring-files" not found Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.591589 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a11ce9-e11f-4d7a-927c-142a0e302298-kube-api-access-xb8w4" (OuterVolumeSpecName: "kube-api-access-xb8w4") pod "45a11ce9-e11f-4d7a-927c-142a0e302298" (UID: "45a11ce9-e11f-4d7a-927c-142a0e302298"). 
InnerVolumeSpecName "kube-api-access-xb8w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.634061 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-config" (OuterVolumeSpecName: "config") pod "45a11ce9-e11f-4d7a-927c-142a0e302298" (UID: "45a11ce9-e11f-4d7a-927c-142a0e302298"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.635830 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45a11ce9-e11f-4d7a-927c-142a0e302298" (UID: "45a11ce9-e11f-4d7a-927c-142a0e302298"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.639347 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45a11ce9-e11f-4d7a-927c-142a0e302298" (UID: "45a11ce9-e11f-4d7a-927c-142a0e302298"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.649951 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 29 14:19:54 crc kubenswrapper[4753]: E0129 14:19:54.650319 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a11ce9-e11f-4d7a-927c-142a0e302298" containerName="dnsmasq-dns" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.650337 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a11ce9-e11f-4d7a-927c-142a0e302298" containerName="dnsmasq-dns" Jan 29 14:19:54 crc kubenswrapper[4753]: E0129 14:19:54.650352 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a11ce9-e11f-4d7a-927c-142a0e302298" containerName="init" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.650359 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a11ce9-e11f-4d7a-927c-142a0e302298" containerName="init" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.650731 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="45a11ce9-e11f-4d7a-927c-142a0e302298" containerName="dnsmasq-dns" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.655388 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.659596 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.659656 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45a11ce9-e11f-4d7a-927c-142a0e302298" (UID: "45a11ce9-e11f-4d7a-927c-142a0e302298"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.659916 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.660048 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.662512 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6clg9" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680719 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680777 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680805 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-scripts\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680852 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8k9\" (UniqueName: \"kubernetes.io/projected/31019cc8-ce90-453f-be4f-949ed45a5873-kube-api-access-zk8k9\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680869 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680887 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680915 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-config\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680954 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb8w4\" (UniqueName: \"kubernetes.io/projected/45a11ce9-e11f-4d7a-927c-142a0e302298-kube-api-access-xb8w4\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680965 4753 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680973 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680982 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.680991 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45a11ce9-e11f-4d7a-927c-142a0e302298-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.684498 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.783277 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.783356 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.783392 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-scripts\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.783450 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk8k9\" (UniqueName: \"kubernetes.io/projected/31019cc8-ce90-453f-be4f-949ed45a5873-kube-api-access-zk8k9\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.783473 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.783503 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.783559 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-config\") pod \"ovn-northd-0\" (UID: 
\"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.784113 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.784734 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-config\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.785131 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-scripts\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.787861 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.787864 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.788484 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.804121 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk8k9\" (UniqueName: \"kubernetes.io/projected/31019cc8-ce90-453f-be4f-949ed45a5873-kube-api-access-zk8k9\") pod \"ovn-northd-0\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " pod="openstack/ovn-northd-0" Jan 29 14:19:54 crc kubenswrapper[4753]: I0129 14:19:54.977226 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.071142 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" event={"ID":"59b14dcd-09ad-4186-98d6-781ef2a5c3f6","Type":"ContainerStarted","Data":"e4cc9a6cff40c07ad338240f406155a82abcd20c309cad37862dd5977188bc04"} Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.071550 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.073318 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"001ea12a-a725-4cd9-a12e-1442d56f7068","Type":"ContainerStarted","Data":"b638be6aff7479c4c3b3e3e264266ee7a0c8949f1731326d7adf23a76a43271b"} Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.076992 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94157b6b-3cc9-44e9-9625-64d34611046a","Type":"ContainerStarted","Data":"85e717f2d1168cff52e5656e97f7028853eb41a9dedbe7b1a8d1cda97bf06e35"} Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.079290 4753 generic.go:334] "Generic (PLEG): container finished" podID="45a11ce9-e11f-4d7a-927c-142a0e302298" containerID="0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279" exitCode=0 Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.079393 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.080137 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" event={"ID":"45a11ce9-e11f-4d7a-927c-142a0e302298","Type":"ContainerDied","Data":"0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279"} Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.080273 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586b989cdc-9lql4" event={"ID":"45a11ce9-e11f-4d7a-927c-142a0e302298","Type":"ContainerDied","Data":"216842678ceedda14e94b524bfe84892c6c86962545d778fcb8b6228e6629ef0"} Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.080352 4753 scope.go:117] "RemoveContainer" containerID="0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.131273 4753 scope.go:117] "RemoveContainer" containerID="094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.144396 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" podStartSLOduration=3.144357016 podStartE2EDuration="3.144357016s" podCreationTimestamp="2026-01-29 14:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:19:55.094054038 +0000 UTC m=+1029.788788470" watchObservedRunningTime="2026-01-29 14:19:55.144357016 +0000 UTC m=+1029.839091418" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.151503 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=39.26683584 podStartE2EDuration="47.151490709s" podCreationTimestamp="2026-01-29 14:19:08 +0000 UTC" firstStartedPulling="2026-01-29 14:19:31.628934899 +0000 UTC m=+1006.323669281" 
lastFinishedPulling="2026-01-29 14:19:39.513589758 +0000 UTC m=+1014.208324150" observedRunningTime="2026-01-29 14:19:55.148981801 +0000 UTC m=+1029.843716183" watchObservedRunningTime="2026-01-29 14:19:55.151490709 +0000 UTC m=+1029.846225081" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.186581 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=37.694429162 podStartE2EDuration="46.186560266s" podCreationTimestamp="2026-01-29 14:19:09 +0000 UTC" firstStartedPulling="2026-01-29 14:19:31.664401576 +0000 UTC m=+1006.359135958" lastFinishedPulling="2026-01-29 14:19:40.15653268 +0000 UTC m=+1014.851267062" observedRunningTime="2026-01-29 14:19:55.179889465 +0000 UTC m=+1029.874623847" watchObservedRunningTime="2026-01-29 14:19:55.186560266 +0000 UTC m=+1029.881294648" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.196652 4753 scope.go:117] "RemoveContainer" containerID="0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279" Jan 29 14:19:55 crc kubenswrapper[4753]: E0129 14:19:55.200762 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279\": container with ID starting with 0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279 not found: ID does not exist" containerID="0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.200814 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279"} err="failed to get container status \"0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279\": rpc error: code = NotFound desc = could not find container \"0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279\": container with ID starting with 0ed506601b8a0b0021525baa089636dc1255bf8bdaa397783e1073aa500a5279 not found: ID does not exist" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.200845 4753 scope.go:117] "RemoveContainer" containerID="094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759" Jan 29 14:19:55 crc kubenswrapper[4753]: E0129 14:19:55.201264 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759\": container with ID starting with 094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759 not found: ID does not exist" containerID="094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.201288 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759"} err="failed to get container status \"094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759\": rpc error: code = NotFound desc = could not find container \"094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759\": container with ID starting with 094efe0ff18c7d20ace1ce25e24c9d806d47342d0ecf0e3881f94226b875c759 not found: ID does not exist" Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.210476 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9lql4"] Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 
14:19:55.216972 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586b989cdc-9lql4"] Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.365295 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 14:19:55 crc kubenswrapper[4753]: W0129 14:19:55.368482 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31019cc8_ce90_453f_be4f_949ed45a5873.slice/crio-65675f1aabccde5b55da60f3fa9c1ffebba3bb8602450f43d737e1445a53afbf WatchSource:0}: Error finding container 65675f1aabccde5b55da60f3fa9c1ffebba3bb8602450f43d737e1445a53afbf: Status 404 returned error can't find the container with id 65675f1aabccde5b55da60f3fa9c1ffebba3bb8602450f43d737e1445a53afbf Jan 29 14:19:55 crc kubenswrapper[4753]: I0129 14:19:55.606070 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:55 crc kubenswrapper[4753]: E0129 14:19:55.606326 4753 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 14:19:55 crc kubenswrapper[4753]: E0129 14:19:55.606832 4753 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 14:19:55 crc kubenswrapper[4753]: E0129 14:19:55.606899 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift podName:ac6d9169-050f-40e0-91ff-80d0afa6ff53 nodeName:}" failed. No retries permitted until 2026-01-29 14:19:57.606879266 +0000 UTC m=+1032.301613648 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift") pod "swift-storage-0" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53") : configmap "swift-ring-files" not found Jan 29 14:19:56 crc kubenswrapper[4753]: I0129 14:19:56.091809 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"31019cc8-ce90-453f-be4f-949ed45a5873","Type":"ContainerStarted","Data":"65675f1aabccde5b55da60f3fa9c1ffebba3bb8602450f43d737e1445a53afbf"} Jan 29 14:19:56 crc kubenswrapper[4753]: I0129 14:19:56.161311 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a11ce9-e11f-4d7a-927c-142a0e302298" path="/var/lib/kubelet/pods/45a11ce9-e11f-4d7a-927c-142a0e302298/volumes" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.645407 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:19:57 crc kubenswrapper[4753]: E0129 14:19:57.645703 4753 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 14:19:57 crc kubenswrapper[4753]: E0129 14:19:57.646115 4753 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 14:19:57 crc kubenswrapper[4753]: E0129 14:19:57.646231 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift podName:ac6d9169-050f-40e0-91ff-80d0afa6ff53 nodeName:}" failed. No retries permitted until 2026-01-29 14:20:01.646204864 +0000 UTC m=+1036.340939276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift") pod "swift-storage-0" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53") : configmap "swift-ring-files" not found Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.766607 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-72fvk"] Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.768535 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.770400 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.770550 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.771678 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.805172 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s47kf"] Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.806299 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.819685 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-72fvk"] Jan 29 14:19:57 crc kubenswrapper[4753]: E0129 14:19:57.820330 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-xppsn ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-xppsn ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-72fvk" podUID="30ac572c-fa1c-49dc-b0b1-27b7350c133b" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.833111 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s47kf"] Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850533 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52rp\" (UniqueName: \"kubernetes.io/projected/fd640e3a-86af-406d-83c6-2df59b891fc3-kube-api-access-s52rp\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850591 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-dispersionconf\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850635 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd640e3a-86af-406d-83c6-2df59b891fc3-etc-swift\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850698 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-combined-ca-bundle\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850717 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-combined-ca-bundle\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850737 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/30ac572c-fa1c-49dc-b0b1-27b7350c133b-etc-swift\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850756 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-swiftconf\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850772 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-dispersionconf\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850791 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-scripts\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850809 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-ring-data-devices\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850867 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-ring-data-devices\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.850931 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xppsn\" (UniqueName: \"kubernetes.io/projected/30ac572c-fa1c-49dc-b0b1-27b7350c133b-kube-api-access-xppsn\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.851188 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-scripts\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.851281 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-swiftconf\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.870133 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-72fvk"] Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953296 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-combined-ca-bundle\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " 
pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953338 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-combined-ca-bundle\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953357 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/30ac572c-fa1c-49dc-b0b1-27b7350c133b-etc-swift\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953378 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-swiftconf\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953392 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-dispersionconf\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953412 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-scripts\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953428 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-ring-data-devices\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953449 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-ring-data-devices\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953467 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xppsn\" (UniqueName: \"kubernetes.io/projected/30ac572c-fa1c-49dc-b0b1-27b7350c133b-kube-api-access-xppsn\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953507 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-scripts\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953537 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-swiftconf\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953567 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s52rp\" (UniqueName: \"kubernetes.io/projected/fd640e3a-86af-406d-83c6-2df59b891fc3-kube-api-access-s52rp\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953592 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-dispersionconf\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.953611 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd640e3a-86af-406d-83c6-2df59b891fc3-etc-swift\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.955362 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/30ac572c-fa1c-49dc-b0b1-27b7350c133b-etc-swift\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.955436 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd640e3a-86af-406d-83c6-2df59b891fc3-etc-swift\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.955454 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-ring-data-devices\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.955884 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-ring-data-devices\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.955905 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-scripts\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.956074 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-scripts\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.966236 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-swiftconf\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.966405 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-dispersionconf\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.966537 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-swiftconf\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.966832 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-dispersionconf\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.967398 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-combined-ca-bundle\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.969516 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xppsn\" (UniqueName: \"kubernetes.io/projected/30ac572c-fa1c-49dc-b0b1-27b7350c133b-kube-api-access-xppsn\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.969951 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-combined-ca-bundle\") pod \"swift-ring-rebalance-72fvk\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:57 crc kubenswrapper[4753]: I0129 14:19:57.976752 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52rp\" (UniqueName: \"kubernetes.io/projected/fd640e3a-86af-406d-83c6-2df59b891fc3-kube-api-access-s52rp\") pod \"swift-ring-rebalance-s47kf\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.121523 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.129103 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.133310 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.156539 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-scripts\") pod \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.156775 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-dispersionconf\") pod \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.156856 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-swiftconf\") pod \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.157011 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-combined-ca-bundle\") pod \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.157137 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/30ac572c-fa1c-49dc-b0b1-27b7350c133b-etc-swift\") pod \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.157254 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xppsn\" (UniqueName: \"kubernetes.io/projected/30ac572c-fa1c-49dc-b0b1-27b7350c133b-kube-api-access-xppsn\") pod \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.157361 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-ring-data-devices\") pod \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\" (UID: \"30ac572c-fa1c-49dc-b0b1-27b7350c133b\") " Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.157709 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-scripts" (OuterVolumeSpecName: "scripts") pod "30ac572c-fa1c-49dc-b0b1-27b7350c133b" (UID: "30ac572c-fa1c-49dc-b0b1-27b7350c133b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.157861 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.158145 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ac572c-fa1c-49dc-b0b1-27b7350c133b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "30ac572c-fa1c-49dc-b0b1-27b7350c133b" (UID: "30ac572c-fa1c-49dc-b0b1-27b7350c133b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.158289 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "30ac572c-fa1c-49dc-b0b1-27b7350c133b" (UID: "30ac572c-fa1c-49dc-b0b1-27b7350c133b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.162503 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30ac572c-fa1c-49dc-b0b1-27b7350c133b" (UID: "30ac572c-fa1c-49dc-b0b1-27b7350c133b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.162686 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ac572c-fa1c-49dc-b0b1-27b7350c133b-kube-api-access-xppsn" (OuterVolumeSpecName: "kube-api-access-xppsn") pod "30ac572c-fa1c-49dc-b0b1-27b7350c133b" (UID: "30ac572c-fa1c-49dc-b0b1-27b7350c133b"). InnerVolumeSpecName "kube-api-access-xppsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.166707 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "30ac572c-fa1c-49dc-b0b1-27b7350c133b" (UID: "30ac572c-fa1c-49dc-b0b1-27b7350c133b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.167439 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "30ac572c-fa1c-49dc-b0b1-27b7350c133b" (UID: "30ac572c-fa1c-49dc-b0b1-27b7350c133b"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.260426 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.260474 4753 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/30ac572c-fa1c-49dc-b0b1-27b7350c133b-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.260487 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xppsn\" (UniqueName: \"kubernetes.io/projected/30ac572c-fa1c-49dc-b0b1-27b7350c133b-kube-api-access-xppsn\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.260504 4753 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/30ac572c-fa1c-49dc-b0b1-27b7350c133b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.260515 4753 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.260527 4753 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/30ac572c-fa1c-49dc-b0b1-27b7350c133b-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 29 14:19:58 crc kubenswrapper[4753]: I0129 14:19:58.626573 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s47kf"] Jan 29 14:19:58 crc kubenswrapper[4753]: W0129 14:19:58.630407 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd640e3a_86af_406d_83c6_2df59b891fc3.slice/crio-e581e52d8cec50ba5e8993255685b79a23d4de3fc28b9688af9948a825c6bfc9 WatchSource:0}: Error finding container e581e52d8cec50ba5e8993255685b79a23d4de3fc28b9688af9948a825c6bfc9: Status 404 returned error can't find the container with id e581e52d8cec50ba5e8993255685b79a23d4de3fc28b9688af9948a825c6bfc9 Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.130508 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"31019cc8-ce90-453f-be4f-949ed45a5873","Type":"ContainerStarted","Data":"0c9a422d95efc2b8a373980fb0f3a46037a883ada7821c87b5bc7209541856f4"} Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.130553 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"31019cc8-ce90-453f-be4f-949ed45a5873","Type":"ContainerStarted","Data":"b19308d0814c3df635fdb38228ce9b7ebf5a99fefcc0274c1834d736932a59bd"} Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.131528 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.133240 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-72fvk" Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.133767 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s47kf" event={"ID":"fd640e3a-86af-406d-83c6-2df59b891fc3","Type":"ContainerStarted","Data":"e581e52d8cec50ba5e8993255685b79a23d4de3fc28b9688af9948a825c6bfc9"} Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.176663 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.334871725 podStartE2EDuration="5.17663569s" podCreationTimestamp="2026-01-29 14:19:54 +0000 UTC" firstStartedPulling="2026-01-29 14:19:55.37260031 +0000 UTC m=+1030.067334702" lastFinishedPulling="2026-01-29 14:19:58.214364275 +0000 UTC m=+1032.909098667" observedRunningTime="2026-01-29 14:19:59.161860081 +0000 UTC m=+1033.856594463" watchObservedRunningTime="2026-01-29 14:19:59.17663569 +0000 UTC m=+1033.871370112" Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.221186 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-72fvk"] Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.226239 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-72fvk"] Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.483699 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 29 14:19:59 crc kubenswrapper[4753]: I0129 14:19:59.483770 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 29 14:19:59 crc kubenswrapper[4753]: E0129 14:19:59.843218 4753 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.142:53500->38.102.83.142:37393: write tcp 38.102.83.142:53500->38.102.83.142:37393: write: broken pipe Jan 29 14:20:00 crc kubenswrapper[4753]: I0129 14:20:00.159070 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ac572c-fa1c-49dc-b0b1-27b7350c133b" path="/var/lib/kubelet/pods/30ac572c-fa1c-49dc-b0b1-27b7350c133b/volumes" Jan 29 14:20:00 crc kubenswrapper[4753]: I0129 14:20:00.921369 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:20:01 crc kubenswrapper[4753]: I0129 14:20:01.162800 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 29 14:20:01 crc kubenswrapper[4753]: I0129 14:20:01.162873 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 29 14:20:01 crc kubenswrapper[4753]: I0129 14:20:01.279354 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 29 14:20:01 crc kubenswrapper[4753]: I0129 14:20:01.648874 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:20:01 crc kubenswrapper[4753]: E0129 14:20:01.649096 4753 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 29 14:20:01 crc kubenswrapper[4753]: E0129 14:20:01.649128 4753 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 29 14:20:01 crc kubenswrapper[4753]: E0129 14:20:01.649223 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift podName:ac6d9169-050f-40e0-91ff-80d0afa6ff53 nodeName:}" failed. No retries permitted until 2026-01-29 14:20:09.649205347 +0000 UTC m=+1044.343939719 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift") pod "swift-storage-0" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53") : configmap "swift-ring-files" not found Jan 29 14:20:02 crc kubenswrapper[4753]: I0129 14:20:02.058946 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 29 14:20:02 crc kubenswrapper[4753]: I0129 14:20:02.175669 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 29 14:20:02 crc kubenswrapper[4753]: I0129 14:20:02.299123 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.104385 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.185187 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s47kf" event={"ID":"fd640e3a-86af-406d-83c6-2df59b891fc3","Type":"ContainerStarted","Data":"e9817aee647b3f9d81adf2e4ce14d5bae32c01236812cacdf0002005b7c75760"} Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.190740 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-4ccz9"] Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.191046 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" podUID="f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" containerName="dnsmasq-dns" containerID="cri-o://f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e" gracePeriod=10 Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.231593 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-s47kf" podStartSLOduration=2.631698907 podStartE2EDuration="6.231573246s" podCreationTimestamp="2026-01-29 14:19:57 +0000 UTC" firstStartedPulling="2026-01-29 14:19:58.633132323 +0000 UTC m=+1033.327866715" lastFinishedPulling="2026-01-29 14:20:02.233006672 +0000 UTC m=+1036.927741054" observedRunningTime="2026-01-29 14:20:03.220667601 +0000 UTC m=+1037.915401993" watchObservedRunningTime="2026-01-29 14:20:03.231573246 +0000 UTC m=+1037.926307648" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.741596 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.893699 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-ovsdbserver-sb\") pod \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.893776 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nvrd\" (UniqueName: \"kubernetes.io/projected/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-kube-api-access-7nvrd\") pod \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.893817 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-config\") pod \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.893915 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-dns-svc\") pod \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\" (UID: \"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c\") " Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.899277 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-kube-api-access-7nvrd" (OuterVolumeSpecName: "kube-api-access-7nvrd") pod "f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" (UID: "f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c"). InnerVolumeSpecName "kube-api-access-7nvrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.928347 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" (UID: "f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.943337 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" (UID: "f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.948071 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-config" (OuterVolumeSpecName: "config") pod "f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" (UID: "f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.995993 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.996670 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nvrd\" (UniqueName: \"kubernetes.io/projected/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-kube-api-access-7nvrd\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.996788 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:03 crc kubenswrapper[4753]: I0129 14:20:03.996858 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.193396 4753 generic.go:334] "Generic (PLEG): container finished" podID="f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" containerID="f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e" exitCode=0 Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.194173 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.194947 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" event={"ID":"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c","Type":"ContainerDied","Data":"f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e"} Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.195231 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b79764b65-4ccz9" event={"ID":"f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c","Type":"ContainerDied","Data":"74d4fb495226086a92d681c6ba3cfcae6b05b09857ee144140151ec8ed7e1165"} Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.195250 4753 scope.go:117] "RemoveContainer" containerID="f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e" Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.229903 4753 scope.go:117] "RemoveContainer" containerID="21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2" Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.231055 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-4ccz9"] Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.238651 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b79764b65-4ccz9"] Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.262324 4753 scope.go:117] "RemoveContainer" containerID="f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e" Jan 29 14:20:04 crc kubenswrapper[4753]: E0129 14:20:04.262867 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e\": container with ID starting with f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e not found: ID does not exist" containerID="f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e" Jan 29 14:20:04 crc kubenswrapper[4753]: 
I0129 14:20:04.263119 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e"} err="failed to get container status \"f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e\": rpc error: code = NotFound desc = could not find container \"f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e\": container with ID starting with f86d977e14a7993f3024bf75374bdcabb436139b3c50edfa0ec0ea0bc0ad348e not found: ID does not exist" Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.263357 4753 scope.go:117] "RemoveContainer" containerID="21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2" Jan 29 14:20:04 crc kubenswrapper[4753]: E0129 14:20:04.263867 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2\": container with ID starting with 21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2 not found: ID does not exist" containerID="21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2" Jan 29 14:20:04 crc kubenswrapper[4753]: I0129 14:20:04.263906 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2"} err="failed to get container status \"21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2\": rpc error: code = NotFound desc = could not find container \"21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2\": container with ID starting with 21aa2588d734f947a6e7feaba830db952a0dc10eb6f3125ac0a79b68b24bcad2 not found: ID does not exist" Jan 29 14:20:05 crc kubenswrapper[4753]: I0129 14:20:05.215717 4753 generic.go:334] "Generic (PLEG): container finished" podID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" containerID="81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd" exitCode=0 Jan 29 14:20:05 crc kubenswrapper[4753]: I0129 14:20:05.215785 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f7e3e27-a036-4623-8d63-557a3c0d76e6","Type":"ContainerDied","Data":"81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd"} Jan 29 14:20:05 crc kubenswrapper[4753]: I0129 14:20:05.218020 4753 generic.go:334] "Generic (PLEG): container finished" podID="ad5c04aa-ed92-4c33-ad37-4420b362e237" containerID="6ea11bc6de1dca2ccb62590a48ec34192f4584015fe67a14444f1dcae0fb9d8d" exitCode=0 Jan 29 14:20:05 crc kubenswrapper[4753]: I0129 14:20:05.218072 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad5c04aa-ed92-4c33-ad37-4420b362e237","Type":"ContainerDied","Data":"6ea11bc6de1dca2ccb62590a48ec34192f4584015fe67a14444f1dcae0fb9d8d"} Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.172240 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" path="/var/lib/kubelet/pods/f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c/volumes" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.229758 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad5c04aa-ed92-4c33-ad37-4420b362e237","Type":"ContainerStarted","Data":"3b34f95853a15ff9210f7c5a34e53924e5ea049fe09b94a0c39100cd6c83fdab"} Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.230020 4753 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.232859 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f7e3e27-a036-4623-8d63-557a3c0d76e6","Type":"ContainerStarted","Data":"a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c"} Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.233188 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.283775 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371976.57102 podStartE2EDuration="1m0.283755024s" podCreationTimestamp="2026-01-29 14:19:06 +0000 UTC" firstStartedPulling="2026-01-29 14:19:08.658161818 +0000 UTC m=+983.352896200" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:20:06.272006496 +0000 UTC m=+1040.966740888" watchObservedRunningTime="2026-01-29 14:20:06.283755024 +0000 UTC m=+1040.978489406" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.297583 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c053-account-create-update-pbzxx"] Jan 29 14:20:06 crc kubenswrapper[4753]: E0129 14:20:06.297988 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" containerName="init" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.298008 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" containerName="init" Jan 29 14:20:06 crc kubenswrapper[4753]: E0129 14:20:06.298042 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" containerName="dnsmasq-dns" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.298051 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" containerName="dnsmasq-dns" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.298274 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a4eb20-8bfe-47ab-86ff-2e030a0f7d2c" containerName="dnsmasq-dns" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.298868 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.300591 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.304792 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.152666856 podStartE2EDuration="1m0.304775002s" podCreationTimestamp="2026-01-29 14:19:06 +0000 UTC" firstStartedPulling="2026-01-29 14:19:08.766980985 +0000 UTC m=+983.461715367" lastFinishedPulling="2026-01-29 14:19:30.919089121 +0000 UTC m=+1005.613823513" observedRunningTime="2026-01-29 14:20:06.298520912 +0000 UTC m=+1040.993255294" watchObservedRunningTime="2026-01-29 14:20:06.304775002 +0000 UTC m=+1040.999509384" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.317684 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c053-account-create-update-pbzxx"] Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.323697 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-m9d5c"] Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.324750 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.344258 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-m9d5c"] Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.447984 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz7kv\" (UniqueName: \"kubernetes.io/projected/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-kube-api-access-sz7kv\") pod \"glance-c053-account-create-update-pbzxx\" (UID: \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\") " pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.448090 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3381688-f092-411b-b2b2-5cdd67813b7d-operator-scripts\") pod \"glance-db-create-m9d5c\" (UID: \"e3381688-f092-411b-b2b2-5cdd67813b7d\") " pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.448232 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-operator-scripts\") pod \"glance-c053-account-create-update-pbzxx\" (UID: \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\") " pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.448371 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d9wg\" (UniqueName: \"kubernetes.io/projected/e3381688-f092-411b-b2b2-5cdd67813b7d-kube-api-access-2d9wg\") pod \"glance-db-create-m9d5c\" (UID: \"e3381688-f092-411b-b2b2-5cdd67813b7d\") " pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.549790 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3381688-f092-411b-b2b2-5cdd67813b7d-operator-scripts\") pod \"glance-db-create-m9d5c\" (UID: 
\"e3381688-f092-411b-b2b2-5cdd67813b7d\") " pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.549886 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-operator-scripts\") pod \"glance-c053-account-create-update-pbzxx\" (UID: \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\") " pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.549943 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d9wg\" (UniqueName: \"kubernetes.io/projected/e3381688-f092-411b-b2b2-5cdd67813b7d-kube-api-access-2d9wg\") pod \"glance-db-create-m9d5c\" (UID: \"e3381688-f092-411b-b2b2-5cdd67813b7d\") " pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.550024 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz7kv\" (UniqueName: \"kubernetes.io/projected/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-kube-api-access-sz7kv\") pod \"glance-c053-account-create-update-pbzxx\" (UID: \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\") " pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.550672 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3381688-f092-411b-b2b2-5cdd67813b7d-operator-scripts\") pod \"glance-db-create-m9d5c\" (UID: \"e3381688-f092-411b-b2b2-5cdd67813b7d\") " pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.551076 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-operator-scripts\") pod \"glance-c053-account-create-update-pbzxx\" (UID: \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\") " pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.579672 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d9wg\" (UniqueName: \"kubernetes.io/projected/e3381688-f092-411b-b2b2-5cdd67813b7d-kube-api-access-2d9wg\") pod \"glance-db-create-m9d5c\" (UID: \"e3381688-f092-411b-b2b2-5cdd67813b7d\") " pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.579870 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz7kv\" (UniqueName: \"kubernetes.io/projected/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-kube-api-access-sz7kv\") pod \"glance-c053-account-create-update-pbzxx\" (UID: \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\") " pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.616756 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:06 crc kubenswrapper[4753]: I0129 14:20:06.639547 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:07 crc kubenswrapper[4753]: I0129 14:20:07.010814 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c053-account-create-update-pbzxx"] Jan 29 14:20:07 crc kubenswrapper[4753]: W0129 14:20:07.011336 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae41990f_dd6c_4bae_ba0c_ebfde88a25a0.slice/crio-a568c97147b3dc6b28577d1e5c62a09d1235197aee6b703437835db363313b2a WatchSource:0}: Error finding container a568c97147b3dc6b28577d1e5c62a09d1235197aee6b703437835db363313b2a: Status 404 returned error can't find the container with id a568c97147b3dc6b28577d1e5c62a09d1235197aee6b703437835db363313b2a Jan 29 14:20:07 crc kubenswrapper[4753]: I0129 14:20:07.243068 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c053-account-create-update-pbzxx" event={"ID":"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0","Type":"ContainerStarted","Data":"a568c97147b3dc6b28577d1e5c62a09d1235197aee6b703437835db363313b2a"} Jan 29 14:20:07 crc kubenswrapper[4753]: W0129 14:20:07.291973 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3381688_f092_411b_b2b2_5cdd67813b7d.slice/crio-ef1ab13b6c8df8f6fc9fff3bdb2b89ff564915f6a8f021b0429c7a24dd657a6a WatchSource:0}: Error finding container ef1ab13b6c8df8f6fc9fff3bdb2b89ff564915f6a8f021b0429c7a24dd657a6a: Status 404 returned error can't find the container with id ef1ab13b6c8df8f6fc9fff3bdb2b89ff564915f6a8f021b0429c7a24dd657a6a Jan 29 14:20:07 crc kubenswrapper[4753]: I0129 14:20:07.292164 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-m9d5c"] Jan 29 14:20:07 crc kubenswrapper[4753]: I0129 14:20:07.948611 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-h6msp"] Jan 29 14:20:07 crc kubenswrapper[4753]: I0129 14:20:07.950104 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:07 crc kubenswrapper[4753]: I0129 14:20:07.960143 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 29 14:20:07 crc kubenswrapper[4753]: I0129 14:20:07.960542 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h6msp"] Jan 29 14:20:07 crc kubenswrapper[4753]: I0129 14:20:07.981095 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjw9h\" (UniqueName: \"kubernetes.io/projected/6ad50e7c-b519-401c-8121-ab82de569044-kube-api-access-kjw9h\") pod \"root-account-create-update-h6msp\" (UID: \"6ad50e7c-b519-401c-8121-ab82de569044\") " pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:07 crc kubenswrapper[4753]: I0129 14:20:07.981220 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ad50e7c-b519-401c-8121-ab82de569044-operator-scripts\") pod \"root-account-create-update-h6msp\" (UID: \"6ad50e7c-b519-401c-8121-ab82de569044\") " pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:08 crc kubenswrapper[4753]: I0129 14:20:08.083515 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjw9h\" (UniqueName: \"kubernetes.io/projected/6ad50e7c-b519-401c-8121-ab82de569044-kube-api-access-kjw9h\") pod \"root-account-create-update-h6msp\" (UID: \"6ad50e7c-b519-401c-8121-ab82de569044\") " pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:08 crc kubenswrapper[4753]: I0129 14:20:08.083611 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ad50e7c-b519-401c-8121-ab82de569044-operator-scripts\") pod \"root-account-create-update-h6msp\" (UID: \"6ad50e7c-b519-401c-8121-ab82de569044\") " pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:08 crc kubenswrapper[4753]: I0129 14:20:08.084734 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ad50e7c-b519-401c-8121-ab82de569044-operator-scripts\") pod \"root-account-create-update-h6msp\" (UID: \"6ad50e7c-b519-401c-8121-ab82de569044\") " pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:08 crc kubenswrapper[4753]: I0129 14:20:08.103818 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjw9h\" (UniqueName: \"kubernetes.io/projected/6ad50e7c-b519-401c-8121-ab82de569044-kube-api-access-kjw9h\") pod \"root-account-create-update-h6msp\" (UID: \"6ad50e7c-b519-401c-8121-ab82de569044\") " pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:08 crc kubenswrapper[4753]: I0129 14:20:08.251565 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m9d5c" event={"ID":"e3381688-f092-411b-b2b2-5cdd67813b7d","Type":"ContainerStarted","Data":"ef1ab13b6c8df8f6fc9fff3bdb2b89ff564915f6a8f021b0429c7a24dd657a6a"} Jan 29 14:20:08 crc kubenswrapper[4753]: I0129 14:20:08.321351 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:08 crc kubenswrapper[4753]: I0129 14:20:08.807335 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h6msp"] Jan 29 14:20:09 crc kubenswrapper[4753]: I0129 14:20:09.260099 4753 generic.go:334] "Generic (PLEG): container finished" podID="ae41990f-dd6c-4bae-ba0c-ebfde88a25a0" containerID="0f4a8a7dd4d3d6d4c66441e081f1c7bde2dc88522a62f0e1514743d865dd861a" exitCode=0 Jan 29 14:20:09 crc kubenswrapper[4753]: I0129 14:20:09.260191 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c053-account-create-update-pbzxx" event={"ID":"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0","Type":"ContainerDied","Data":"0f4a8a7dd4d3d6d4c66441e081f1c7bde2dc88522a62f0e1514743d865dd861a"} Jan 29 14:20:09 crc kubenswrapper[4753]: I0129 14:20:09.262310 4753 generic.go:334] "Generic (PLEG): container finished" podID="e3381688-f092-411b-b2b2-5cdd67813b7d" containerID="197dc1a71d086547b1bd896b6d0217f47e70ba62b316ca866c8afcd65fa24313" exitCode=0 Jan 29 14:20:09 crc kubenswrapper[4753]: I0129 14:20:09.262365 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m9d5c" event={"ID":"e3381688-f092-411b-b2b2-5cdd67813b7d","Type":"ContainerDied","Data":"197dc1a71d086547b1bd896b6d0217f47e70ba62b316ca866c8afcd65fa24313"} Jan 29 14:20:09 crc kubenswrapper[4753]: I0129 14:20:09.264373 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h6msp" event={"ID":"6ad50e7c-b519-401c-8121-ab82de569044","Type":"ContainerStarted","Data":"cdc4d17706fe091cbd13f2095b63490a23b139430dc546836664bce0499b67f6"} Jan 29 14:20:09 crc kubenswrapper[4753]: I0129 14:20:09.264407 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h6msp" event={"ID":"6ad50e7c-b519-401c-8121-ab82de569044","Type":"ContainerStarted","Data":"3ebc8bd21335ebac47c4741f2e5abc63576ed313cd893b3259ca7e71ef7e40cb"} Jan 29 14:20:09 crc kubenswrapper[4753]: I0129 14:20:09.713325 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:20:09 crc kubenswrapper[4753]: I0129 14:20:09.721834 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") pod \"swift-storage-0\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " pod="openstack/swift-storage-0" Jan 29 14:20:09 crc kubenswrapper[4753]: I0129 14:20:09.816064 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.271889 4753 generic.go:334] "Generic (PLEG): container finished" podID="6ad50e7c-b519-401c-8121-ab82de569044" containerID="cdc4d17706fe091cbd13f2095b63490a23b139430dc546836664bce0499b67f6" exitCode=0 Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.271950 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h6msp" event={"ID":"6ad50e7c-b519-401c-8121-ab82de569044","Type":"ContainerDied","Data":"cdc4d17706fe091cbd13f2095b63490a23b139430dc546836664bce0499b67f6"} Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.273615 4753 generic.go:334] "Generic (PLEG): container finished" podID="fd640e3a-86af-406d-83c6-2df59b891fc3" containerID="e9817aee647b3f9d81adf2e4ce14d5bae32c01236812cacdf0002005b7c75760" exitCode=0 Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.273674 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s47kf" event={"ID":"fd640e3a-86af-406d-83c6-2df59b891fc3","Type":"ContainerDied","Data":"e9817aee647b3f9d81adf2e4ce14d5bae32c01236812cacdf0002005b7c75760"} Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.447837 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.490876 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mfqkt"] Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.492058 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.498385 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mfqkt"] Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.527349 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aba56e7-db42-42d9-9586-fb6a145f2a39-operator-scripts\") pod \"keystone-db-create-mfqkt\" (UID: \"6aba56e7-db42-42d9-9586-fb6a145f2a39\") " pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.527417 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsv8w\" (UniqueName: \"kubernetes.io/projected/6aba56e7-db42-42d9-9586-fb6a145f2a39-kube-api-access-dsv8w\") pod \"keystone-db-create-mfqkt\" (UID: \"6aba56e7-db42-42d9-9586-fb6a145f2a39\") " pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.587468 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2e4b-account-create-update-sbtfx"] Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.588460 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.590608 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.596983 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2e4b-account-create-update-sbtfx"] Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.629250 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsv8w\" (UniqueName: \"kubernetes.io/projected/6aba56e7-db42-42d9-9586-fb6a145f2a39-kube-api-access-dsv8w\") pod \"keystone-db-create-mfqkt\" (UID: \"6aba56e7-db42-42d9-9586-fb6a145f2a39\") " pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.629302 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-operator-scripts\") pod \"keystone-2e4b-account-create-update-sbtfx\" (UID: \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\") " pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.629334 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xklc2\" (UniqueName: \"kubernetes.io/projected/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-kube-api-access-xklc2\") pod \"keystone-2e4b-account-create-update-sbtfx\" (UID: \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\") " pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.629464 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aba56e7-db42-42d9-9586-fb6a145f2a39-operator-scripts\") pod \"keystone-db-create-mfqkt\" (UID: \"6aba56e7-db42-42d9-9586-fb6a145f2a39\") " pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.630169 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aba56e7-db42-42d9-9586-fb6a145f2a39-operator-scripts\") pod \"keystone-db-create-mfqkt\" (UID: \"6aba56e7-db42-42d9-9586-fb6a145f2a39\") " pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.655895 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsv8w\" (UniqueName: \"kubernetes.io/projected/6aba56e7-db42-42d9-9586-fb6a145f2a39-kube-api-access-dsv8w\") pod \"keystone-db-create-mfqkt\" (UID: \"6aba56e7-db42-42d9-9586-fb6a145f2a39\") " pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.691002 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.695051 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.730840 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz7kv\" (UniqueName: \"kubernetes.io/projected/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-kube-api-access-sz7kv\") pod \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\" (UID: \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\") " Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.730918 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-operator-scripts\") pod \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\" (UID: \"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0\") " Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.731019 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3381688-f092-411b-b2b2-5cdd67813b7d-operator-scripts\") pod \"e3381688-f092-411b-b2b2-5cdd67813b7d\" (UID: \"e3381688-f092-411b-b2b2-5cdd67813b7d\") " Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.731121 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d9wg\" (UniqueName: \"kubernetes.io/projected/e3381688-f092-411b-b2b2-5cdd67813b7d-kube-api-access-2d9wg\") pod \"e3381688-f092-411b-b2b2-5cdd67813b7d\" (UID: \"e3381688-f092-411b-b2b2-5cdd67813b7d\") " Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.731426 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-operator-scripts\") pod \"keystone-2e4b-account-create-update-sbtfx\" (UID: \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\") " pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.731465 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xklc2\" (UniqueName: \"kubernetes.io/projected/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-kube-api-access-xklc2\") pod \"keystone-2e4b-account-create-update-sbtfx\" (UID: \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\") " pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.733125 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae41990f-dd6c-4bae-ba0c-ebfde88a25a0" (UID: "ae41990f-dd6c-4bae-ba0c-ebfde88a25a0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.734274 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-operator-scripts\") pod \"keystone-2e4b-account-create-update-sbtfx\" (UID: \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\") " pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.734710 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3381688-f092-411b-b2b2-5cdd67813b7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3381688-f092-411b-b2b2-5cdd67813b7d" (UID: "e3381688-f092-411b-b2b2-5cdd67813b7d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.736409 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3381688-f092-411b-b2b2-5cdd67813b7d-kube-api-access-2d9wg" (OuterVolumeSpecName: "kube-api-access-2d9wg") pod "e3381688-f092-411b-b2b2-5cdd67813b7d" (UID: "e3381688-f092-411b-b2b2-5cdd67813b7d"). InnerVolumeSpecName "kube-api-access-2d9wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.736835 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-kube-api-access-sz7kv" (OuterVolumeSpecName: "kube-api-access-sz7kv") pod "ae41990f-dd6c-4bae-ba0c-ebfde88a25a0" (UID: "ae41990f-dd6c-4bae-ba0c-ebfde88a25a0"). InnerVolumeSpecName "kube-api-access-sz7kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.749415 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xklc2\" (UniqueName: \"kubernetes.io/projected/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-kube-api-access-xklc2\") pod \"keystone-2e4b-account-create-update-sbtfx\" (UID: \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\") " pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.817695 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.833369 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3381688-f092-411b-b2b2-5cdd67813b7d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.833401 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d9wg\" (UniqueName: \"kubernetes.io/projected/e3381688-f092-411b-b2b2-5cdd67813b7d-kube-api-access-2d9wg\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.833411 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz7kv\" (UniqueName: \"kubernetes.io/projected/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-kube-api-access-sz7kv\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.833422 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.921110 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.954601 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-gqp9n"] Jan 29 14:20:10 crc kubenswrapper[4753]: E0129 14:20:10.955026 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae41990f-dd6c-4bae-ba0c-ebfde88a25a0" containerName="mariadb-account-create-update" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.955042 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae41990f-dd6c-4bae-ba0c-ebfde88a25a0" containerName="mariadb-account-create-update" Jan 29 14:20:10 crc kubenswrapper[4753]: E0129 14:20:10.955069 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3381688-f092-411b-b2b2-5cdd67813b7d" containerName="mariadb-database-create" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.955077 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3381688-f092-411b-b2b2-5cdd67813b7d" containerName="mariadb-database-create" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.955302 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3381688-f092-411b-b2b2-5cdd67813b7d" containerName="mariadb-database-create" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.955325 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae41990f-dd6c-4bae-ba0c-ebfde88a25a0" containerName="mariadb-account-create-update" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.956003 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:10 crc kubenswrapper[4753]: I0129 14:20:10.969840 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gqp9n"] Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.038592 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-operator-scripts\") pod \"placement-db-create-gqp9n\" (UID: \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\") " pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.038657 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4pvh\" (UniqueName: \"kubernetes.io/projected/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-kube-api-access-r4pvh\") pod \"placement-db-create-gqp9n\" (UID: \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\") " pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.082025 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-403e-account-create-update-fksgc"] Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.084024 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.085990 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.094344 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-403e-account-create-update-fksgc"] Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.141002 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4pvh\" (UniqueName: \"kubernetes.io/projected/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-kube-api-access-r4pvh\") pod \"placement-db-create-gqp9n\" (UID: \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\") " pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.141111 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7xhh\" (UniqueName: \"kubernetes.io/projected/94338586-7af0-45e5-b6aa-780295200d4e-kube-api-access-k7xhh\") pod \"placement-403e-account-create-update-fksgc\" (UID: \"94338586-7af0-45e5-b6aa-780295200d4e\") " pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.141138 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94338586-7af0-45e5-b6aa-780295200d4e-operator-scripts\") pod \"placement-403e-account-create-update-fksgc\" (UID: \"94338586-7af0-45e5-b6aa-780295200d4e\") " pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.141360 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-operator-scripts\") pod \"placement-db-create-gqp9n\" (UID: \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\") " pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.142036 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-operator-scripts\") pod \"placement-db-create-gqp9n\" (UID: \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\") " pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.157816 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4pvh\" (UniqueName: \"kubernetes.io/projected/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-kube-api-access-r4pvh\") pod \"placement-db-create-gqp9n\" (UID: \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\") " pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.242813 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7xhh\" (UniqueName: \"kubernetes.io/projected/94338586-7af0-45e5-b6aa-780295200d4e-kube-api-access-k7xhh\") pod \"placement-403e-account-create-update-fksgc\" (UID: \"94338586-7af0-45e5-b6aa-780295200d4e\") " pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.243296 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94338586-7af0-45e5-b6aa-780295200d4e-operator-scripts\") pod \"placement-403e-account-create-update-fksgc\" (UID: \"94338586-7af0-45e5-b6aa-780295200d4e\") " pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.244021 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94338586-7af0-45e5-b6aa-780295200d4e-operator-scripts\") pod \"placement-403e-account-create-update-fksgc\" (UID: \"94338586-7af0-45e5-b6aa-780295200d4e\") " pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.265009 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7xhh\" (UniqueName: \"kubernetes.io/projected/94338586-7af0-45e5-b6aa-780295200d4e-kube-api-access-k7xhh\") pod \"placement-403e-account-create-update-fksgc\" (UID: \"94338586-7af0-45e5-b6aa-780295200d4e\") " pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.283641 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c053-account-create-update-pbzxx" event={"ID":"ae41990f-dd6c-4bae-ba0c-ebfde88a25a0","Type":"ContainerDied","Data":"a568c97147b3dc6b28577d1e5c62a09d1235197aee6b703437835db363313b2a"} Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.283684 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a568c97147b3dc6b28577d1e5c62a09d1235197aee6b703437835db363313b2a" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.283742 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c053-account-create-update-pbzxx" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.285918 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m9d5c" event={"ID":"e3381688-f092-411b-b2b2-5cdd67813b7d","Type":"ContainerDied","Data":"ef1ab13b6c8df8f6fc9fff3bdb2b89ff564915f6a8f021b0429c7a24dd657a6a"} Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.285969 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-m9d5c" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.285991 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef1ab13b6c8df8f6fc9fff3bdb2b89ff564915f6a8f021b0429c7a24dd657a6a" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.287056 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"85ed28ab259952d0e0fdba457c93a2bac0461baa62a68e52c5d1b868679b1e3a"} Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.351651 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mfqkt"] Jan 29 14:20:11 crc kubenswrapper[4753]: W0129 14:20:11.354899 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aba56e7_db42_42d9_9586_fb6a145f2a39.slice/crio-3d9f13946cb1a97ab65e83c5f36564243d4940058ac6fa642f65a6f5af206645 WatchSource:0}: Error finding container 3d9f13946cb1a97ab65e83c5f36564243d4940058ac6fa642f65a6f5af206645: Status 404 returned error can't find the container with id 3d9f13946cb1a97ab65e83c5f36564243d4940058ac6fa642f65a6f5af206645 Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.357987 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.408987 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.444458 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2e4b-account-create-update-sbtfx"] Jan 29 14:20:11 crc kubenswrapper[4753]: W0129 14:20:11.469567 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30b88724_5390_491e_a5b3_0b3fbcbf8bd2.slice/crio-4689b2192096904a9da4831a4a88ec17b2209bffc1d45b876c4988a66dcb16c1 WatchSource:0}: Error finding container 4689b2192096904a9da4831a4a88ec17b2209bffc1d45b876c4988a66dcb16c1: Status 404 returned error can't find the container with id 4689b2192096904a9da4831a4a88ec17b2209bffc1d45b876c4988a66dcb16c1 Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.612871 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.648862 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjw9h\" (UniqueName: \"kubernetes.io/projected/6ad50e7c-b519-401c-8121-ab82de569044-kube-api-access-kjw9h\") pod \"6ad50e7c-b519-401c-8121-ab82de569044\" (UID: \"6ad50e7c-b519-401c-8121-ab82de569044\") " Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.648896 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ad50e7c-b519-401c-8121-ab82de569044-operator-scripts\") pod \"6ad50e7c-b519-401c-8121-ab82de569044\" (UID: \"6ad50e7c-b519-401c-8121-ab82de569044\") " Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.650078 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad50e7c-b519-401c-8121-ab82de569044-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ad50e7c-b519-401c-8121-ab82de569044" (UID: "6ad50e7c-b519-401c-8121-ab82de569044"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.656839 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad50e7c-b519-401c-8121-ab82de569044-kube-api-access-kjw9h" (OuterVolumeSpecName: "kube-api-access-kjw9h") pod "6ad50e7c-b519-401c-8121-ab82de569044" (UID: "6ad50e7c-b519-401c-8121-ab82de569044"). InnerVolumeSpecName "kube-api-access-kjw9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.686915 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.749924 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-ring-data-devices\") pod \"fd640e3a-86af-406d-83c6-2df59b891fc3\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.749969 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-swiftconf\") pod \"fd640e3a-86af-406d-83c6-2df59b891fc3\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.750114 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-dispersionconf\") pod \"fd640e3a-86af-406d-83c6-2df59b891fc3\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.750136 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd640e3a-86af-406d-83c6-2df59b891fc3-etc-swift\") pod \"fd640e3a-86af-406d-83c6-2df59b891fc3\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.750223 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-combined-ca-bundle\") pod \"fd640e3a-86af-406d-83c6-2df59b891fc3\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.750250 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s52rp\" (UniqueName: \"kubernetes.io/projected/fd640e3a-86af-406d-83c6-2df59b891fc3-kube-api-access-s52rp\") pod \"fd640e3a-86af-406d-83c6-2df59b891fc3\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.750269 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-scripts\") pod \"fd640e3a-86af-406d-83c6-2df59b891fc3\" (UID: \"fd640e3a-86af-406d-83c6-2df59b891fc3\") " Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.750591 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjw9h\" (UniqueName: \"kubernetes.io/projected/6ad50e7c-b519-401c-8121-ab82de569044-kube-api-access-kjw9h\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.750607 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ad50e7c-b519-401c-8121-ab82de569044-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.751678 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fd640e3a-86af-406d-83c6-2df59b891fc3" (UID: "fd640e3a-86af-406d-83c6-2df59b891fc3"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.756532 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd640e3a-86af-406d-83c6-2df59b891fc3-kube-api-access-s52rp" (OuterVolumeSpecName: "kube-api-access-s52rp") pod "fd640e3a-86af-406d-83c6-2df59b891fc3" (UID: "fd640e3a-86af-406d-83c6-2df59b891fc3"). InnerVolumeSpecName "kube-api-access-s52rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.757784 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd640e3a-86af-406d-83c6-2df59b891fc3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fd640e3a-86af-406d-83c6-2df59b891fc3" (UID: "fd640e3a-86af-406d-83c6-2df59b891fc3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.761978 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fd640e3a-86af-406d-83c6-2df59b891fc3" (UID: "fd640e3a-86af-406d-83c6-2df59b891fc3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.785862 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-scripts" (OuterVolumeSpecName: "scripts") pod "fd640e3a-86af-406d-83c6-2df59b891fc3" (UID: "fd640e3a-86af-406d-83c6-2df59b891fc3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.789918 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd640e3a-86af-406d-83c6-2df59b891fc3" (UID: "fd640e3a-86af-406d-83c6-2df59b891fc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.793514 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fd640e3a-86af-406d-83c6-2df59b891fc3" (UID: "fd640e3a-86af-406d-83c6-2df59b891fc3"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.852279 4753 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.852610 4753 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.852619 4753 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.852627 4753 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd640e3a-86af-406d-83c6-2df59b891fc3-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.852635 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd640e3a-86af-406d-83c6-2df59b891fc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.852644 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s52rp\" (UniqueName: \"kubernetes.io/projected/fd640e3a-86af-406d-83c6-2df59b891fc3-kube-api-access-s52rp\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.852654 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd640e3a-86af-406d-83c6-2df59b891fc3-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:11 crc kubenswrapper[4753]: I0129 14:20:11.931797 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gqp9n"] Jan 29 14:20:11 crc kubenswrapper[4753]: W0129 14:20:11.986741 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod020d51a4_77d3_4fa1_8184_5867ecf2d6d1.slice/crio-a9aac4414445a868da8e6b7d00a09bde0b20d3327219655eaaa72f5e1be51a55 WatchSource:0}: Error finding container a9aac4414445a868da8e6b7d00a09bde0b20d3327219655eaaa72f5e1be51a55: Status 404 returned error can't find the container with id a9aac4414445a868da8e6b7d00a09bde0b20d3327219655eaaa72f5e1be51a55 Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.010574 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-403e-account-create-update-fksgc"] Jan 29 14:20:12 crc kubenswrapper[4753]: W0129 14:20:12.017362 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94338586_7af0_45e5_b6aa_780295200d4e.slice/crio-d174cb054e0fdbd565ba7afa8dd93f5e5bb48d0ac545620f181b2003ed540be6 WatchSource:0}: Error finding container d174cb054e0fdbd565ba7afa8dd93f5e5bb48d0ac545620f181b2003ed540be6: Status 404 returned error can't find the container with id d174cb054e0fdbd565ba7afa8dd93f5e5bb48d0ac545620f181b2003ed540be6 Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.219084 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rm9d5" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" 
containerName="ovn-controller" probeResult="failure" output=< Jan 29 14:20:12 crc kubenswrapper[4753]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 29 14:20:12 crc kubenswrapper[4753]: > Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.296524 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-403e-account-create-update-fksgc" event={"ID":"94338586-7af0-45e5-b6aa-780295200d4e","Type":"ContainerStarted","Data":"576381ff3eac0a6ba00cc42f35866b9e306929cc77dfe313fbee2eabe20eabfd"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.296837 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-403e-account-create-update-fksgc" event={"ID":"94338586-7af0-45e5-b6aa-780295200d4e","Type":"ContainerStarted","Data":"d174cb054e0fdbd565ba7afa8dd93f5e5bb48d0ac545620f181b2003ed540be6"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.299343 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"894c79912ca8aa5fcde9530fa2c8b0075ce313eea0c4e93fccea72f8d718e504"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.301357 4753 generic.go:334] "Generic (PLEG): container finished" podID="30b88724-5390-491e-a5b3-0b3fbcbf8bd2" containerID="273ad29011f17b33fc1b0ca8c43fc417a7a9fcd227644ac6dea1d27d6b03361b" exitCode=0 Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.301417 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2e4b-account-create-update-sbtfx" event={"ID":"30b88724-5390-491e-a5b3-0b3fbcbf8bd2","Type":"ContainerDied","Data":"273ad29011f17b33fc1b0ca8c43fc417a7a9fcd227644ac6dea1d27d6b03361b"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.301448 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2e4b-account-create-update-sbtfx" event={"ID":"30b88724-5390-491e-a5b3-0b3fbcbf8bd2","Type":"ContainerStarted","Data":"4689b2192096904a9da4831a4a88ec17b2209bffc1d45b876c4988a66dcb16c1"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.303118 4753 generic.go:334] "Generic (PLEG): container finished" podID="6aba56e7-db42-42d9-9586-fb6a145f2a39" containerID="ebd26770e4f3deee8e6cab6959319039b301c068d712e07635f86a428ddaecb3" exitCode=0 Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.303290 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mfqkt" event={"ID":"6aba56e7-db42-42d9-9586-fb6a145f2a39","Type":"ContainerDied","Data":"ebd26770e4f3deee8e6cab6959319039b301c068d712e07635f86a428ddaecb3"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.303313 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mfqkt" event={"ID":"6aba56e7-db42-42d9-9586-fb6a145f2a39","Type":"ContainerStarted","Data":"3d9f13946cb1a97ab65e83c5f36564243d4940058ac6fa642f65a6f5af206645"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.306184 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h6msp" Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.306176 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h6msp" event={"ID":"6ad50e7c-b519-401c-8121-ab82de569044","Type":"ContainerDied","Data":"3ebc8bd21335ebac47c4741f2e5abc63576ed313cd893b3259ca7e71ef7e40cb"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.306333 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ebc8bd21335ebac47c4741f2e5abc63576ed313cd893b3259ca7e71ef7e40cb" Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.309398 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gqp9n" event={"ID":"020d51a4-77d3-4fa1-8184-5867ecf2d6d1","Type":"ContainerStarted","Data":"b74d0aaf87d9391cd8357f4b54f86596f5fc1f32b80ddcb4a7b5f92ee6f68faf"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.309427 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gqp9n" event={"ID":"020d51a4-77d3-4fa1-8184-5867ecf2d6d1","Type":"ContainerStarted","Data":"a9aac4414445a868da8e6b7d00a09bde0b20d3327219655eaaa72f5e1be51a55"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.310678 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-s47kf" event={"ID":"fd640e3a-86af-406d-83c6-2df59b891fc3","Type":"ContainerDied","Data":"e581e52d8cec50ba5e8993255685b79a23d4de3fc28b9688af9948a825c6bfc9"} Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.310711 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e581e52d8cec50ba5e8993255685b79a23d4de3fc28b9688af9948a825c6bfc9" Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.310734 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s47kf" Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.322762 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-403e-account-create-update-fksgc" podStartSLOduration=1.322739795 podStartE2EDuration="1.322739795s" podCreationTimestamp="2026-01-29 14:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:20:12.316886587 +0000 UTC m=+1047.011620979" watchObservedRunningTime="2026-01-29 14:20:12.322739795 +0000 UTC m=+1047.017474187" Jan 29 14:20:12 crc kubenswrapper[4753]: I0129 14:20:12.366657 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-gqp9n" podStartSLOduration=2.36663999 podStartE2EDuration="2.36663999s" podCreationTimestamp="2026-01-29 14:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:20:12.362401326 +0000 UTC m=+1047.057135718" watchObservedRunningTime="2026-01-29 14:20:12.36663999 +0000 UTC m=+1047.061374392" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.324512 4753 generic.go:334] "Generic (PLEG): container finished" podID="94338586-7af0-45e5-b6aa-780295200d4e" containerID="576381ff3eac0a6ba00cc42f35866b9e306929cc77dfe313fbee2eabe20eabfd" exitCode=0 Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.324812 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-403e-account-create-update-fksgc" event={"ID":"94338586-7af0-45e5-b6aa-780295200d4e","Type":"ContainerDied","Data":"576381ff3eac0a6ba00cc42f35866b9e306929cc77dfe313fbee2eabe20eabfd"} Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.328711 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"8c14da07d1d3d581ed1cdb3f263473ebec5da8f60db91aaf8dae95021a6acbfa"} Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.331301 4753 generic.go:334] "Generic (PLEG): container finished" podID="020d51a4-77d3-4fa1-8184-5867ecf2d6d1" containerID="b74d0aaf87d9391cd8357f4b54f86596f5fc1f32b80ddcb4a7b5f92ee6f68faf" exitCode=0 Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.331371 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gqp9n" event={"ID":"020d51a4-77d3-4fa1-8184-5867ecf2d6d1","Type":"ContainerDied","Data":"b74d0aaf87d9391cd8357f4b54f86596f5fc1f32b80ddcb4a7b5f92ee6f68faf"} Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.785554 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.787256 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.883112 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-operator-scripts\") pod \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\" (UID: \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\") " Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.883252 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xklc2\" (UniqueName: \"kubernetes.io/projected/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-kube-api-access-xklc2\") pod \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\" (UID: \"30b88724-5390-491e-a5b3-0b3fbcbf8bd2\") " Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.883307 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aba56e7-db42-42d9-9586-fb6a145f2a39-operator-scripts\") pod \"6aba56e7-db42-42d9-9586-fb6a145f2a39\" (UID: \"6aba56e7-db42-42d9-9586-fb6a145f2a39\") " Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.883404 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsv8w\" (UniqueName: \"kubernetes.io/projected/6aba56e7-db42-42d9-9586-fb6a145f2a39-kube-api-access-dsv8w\") pod \"6aba56e7-db42-42d9-9586-fb6a145f2a39\" (UID: \"6aba56e7-db42-42d9-9586-fb6a145f2a39\") " Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.884755 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aba56e7-db42-42d9-9586-fb6a145f2a39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6aba56e7-db42-42d9-9586-fb6a145f2a39" (UID: "6aba56e7-db42-42d9-9586-fb6a145f2a39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.885213 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30b88724-5390-491e-a5b3-0b3fbcbf8bd2" (UID: "30b88724-5390-491e-a5b3-0b3fbcbf8bd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.889288 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aba56e7-db42-42d9-9586-fb6a145f2a39-kube-api-access-dsv8w" (OuterVolumeSpecName: "kube-api-access-dsv8w") pod "6aba56e7-db42-42d9-9586-fb6a145f2a39" (UID: "6aba56e7-db42-42d9-9586-fb6a145f2a39"). InnerVolumeSpecName "kube-api-access-dsv8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.889330 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-kube-api-access-xklc2" (OuterVolumeSpecName: "kube-api-access-xklc2") pod "30b88724-5390-491e-a5b3-0b3fbcbf8bd2" (UID: "30b88724-5390-491e-a5b3-0b3fbcbf8bd2"). InnerVolumeSpecName "kube-api-access-xklc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.986179 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xklc2\" (UniqueName: \"kubernetes.io/projected/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-kube-api-access-xklc2\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.986427 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aba56e7-db42-42d9-9586-fb6a145f2a39-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.986534 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsv8w\" (UniqueName: \"kubernetes.io/projected/6aba56e7-db42-42d9-9586-fb6a145f2a39-kube-api-access-dsv8w\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:13 crc kubenswrapper[4753]: I0129 14:20:13.986625 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30b88724-5390-491e-a5b3-0b3fbcbf8bd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.344083 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"851cf5e8ab4d46230b00d65ff6d2fe461116124fb2be946904a0939073438463"} Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.344686 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"e165e57a40f6bc0c996660f2d010eaf2048fe443cee81d3a5bb35a9b618cefea"} Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.347954 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2e4b-account-create-update-sbtfx" event={"ID":"30b88724-5390-491e-a5b3-0b3fbcbf8bd2","Type":"ContainerDied","Data":"4689b2192096904a9da4831a4a88ec17b2209bffc1d45b876c4988a66dcb16c1"} Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.347987 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e4b-account-create-update-sbtfx" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.347995 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4689b2192096904a9da4831a4a88ec17b2209bffc1d45b876c4988a66dcb16c1" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.352240 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mfqkt" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.352863 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mfqkt" event={"ID":"6aba56e7-db42-42d9-9586-fb6a145f2a39","Type":"ContainerDied","Data":"3d9f13946cb1a97ab65e83c5f36564243d4940058ac6fa642f65a6f5af206645"} Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.353060 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d9f13946cb1a97ab65e83c5f36564243d4940058ac6fa642f65a6f5af206645" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.693505 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-h6msp"] Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.699964 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-h6msp"] Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.781423 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.787344 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.799024 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7xhh\" (UniqueName: \"kubernetes.io/projected/94338586-7af0-45e5-b6aa-780295200d4e-kube-api-access-k7xhh\") pod \"94338586-7af0-45e5-b6aa-780295200d4e\" (UID: \"94338586-7af0-45e5-b6aa-780295200d4e\") " Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.799074 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94338586-7af0-45e5-b6aa-780295200d4e-operator-scripts\") pod \"94338586-7af0-45e5-b6aa-780295200d4e\" (UID: \"94338586-7af0-45e5-b6aa-780295200d4e\") " Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.799237 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4pvh\" (UniqueName: \"kubernetes.io/projected/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-kube-api-access-r4pvh\") pod \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\" (UID: \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\") " Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.799345 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-operator-scripts\") pod \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\" (UID: \"020d51a4-77d3-4fa1-8184-5867ecf2d6d1\") " Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.800007 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94338586-7af0-45e5-b6aa-780295200d4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94338586-7af0-45e5-b6aa-780295200d4e" (UID: "94338586-7af0-45e5-b6aa-780295200d4e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.800413 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "020d51a4-77d3-4fa1-8184-5867ecf2d6d1" (UID: "020d51a4-77d3-4fa1-8184-5867ecf2d6d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.804486 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94338586-7af0-45e5-b6aa-780295200d4e-kube-api-access-k7xhh" (OuterVolumeSpecName: "kube-api-access-k7xhh") pod "94338586-7af0-45e5-b6aa-780295200d4e" (UID: "94338586-7af0-45e5-b6aa-780295200d4e"). InnerVolumeSpecName "kube-api-access-k7xhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.805024 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-kube-api-access-r4pvh" (OuterVolumeSpecName: "kube-api-access-r4pvh") pod "020d51a4-77d3-4fa1-8184-5867ecf2d6d1" (UID: "020d51a4-77d3-4fa1-8184-5867ecf2d6d1"). InnerVolumeSpecName "kube-api-access-r4pvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.901353 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4pvh\" (UniqueName: \"kubernetes.io/projected/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-kube-api-access-r4pvh\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.901390 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/020d51a4-77d3-4fa1-8184-5867ecf2d6d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.901408 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7xhh\" (UniqueName: \"kubernetes.io/projected/94338586-7af0-45e5-b6aa-780295200d4e-kube-api-access-k7xhh\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:14 crc kubenswrapper[4753]: I0129 14:20:14.901419 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94338586-7af0-45e5-b6aa-780295200d4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:15 crc kubenswrapper[4753]: I0129 14:20:15.056750 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 29 14:20:15 crc kubenswrapper[4753]: I0129 14:20:15.390395 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"902624d332ba6e97b89215d66460f6d1894207ecaac9579489c0f78d378469ed"} Jan 29 14:20:15 crc kubenswrapper[4753]: I0129 14:20:15.393082 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-gqp9n" Jan 29 14:20:15 crc kubenswrapper[4753]: I0129 14:20:15.393084 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gqp9n" event={"ID":"020d51a4-77d3-4fa1-8184-5867ecf2d6d1","Type":"ContainerDied","Data":"a9aac4414445a868da8e6b7d00a09bde0b20d3327219655eaaa72f5e1be51a55"} Jan 29 14:20:15 crc kubenswrapper[4753]: I0129 14:20:15.393480 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9aac4414445a868da8e6b7d00a09bde0b20d3327219655eaaa72f5e1be51a55" Jan 29 14:20:15 crc kubenswrapper[4753]: I0129 14:20:15.395805 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-403e-account-create-update-fksgc" event={"ID":"94338586-7af0-45e5-b6aa-780295200d4e","Type":"ContainerDied","Data":"d174cb054e0fdbd565ba7afa8dd93f5e5bb48d0ac545620f181b2003ed540be6"} Jan 29 14:20:15 crc kubenswrapper[4753]: I0129 14:20:15.395848 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d174cb054e0fdbd565ba7afa8dd93f5e5bb48d0ac545620f181b2003ed540be6" Jan 29 14:20:15 crc kubenswrapper[4753]: I0129 14:20:15.395919 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-403e-account-create-update-fksgc" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.176545 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad50e7c-b519-401c-8121-ab82de569044" path="/var/lib/kubelet/pods/6ad50e7c-b519-401c-8121-ab82de569044/volumes" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.410503 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"2f0532da04eff45cc1a9f36829b01187bceb564dbe88cdc93585471ff9447783"} Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.410543 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"9cc09e193c62f0685bbeb053af7bcee09164f6f7a1f094768044b967541fdd99"} Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.410554 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"e4b8c37d71f04f91b5041bbfd2ebe5398fdb2a0eabb2c788da216bac1701192b"} Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.526797 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gv87j"] Jan 29 14:20:16 crc kubenswrapper[4753]: E0129 14:20:16.527353 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b88724-5390-491e-a5b3-0b3fbcbf8bd2" containerName="mariadb-account-create-update" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527384 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b88724-5390-491e-a5b3-0b3fbcbf8bd2" containerName="mariadb-account-create-update" Jan 29 14:20:16 crc kubenswrapper[4753]: E0129 14:20:16.527415 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad50e7c-b519-401c-8121-ab82de569044" containerName="mariadb-account-create-update" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527429 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad50e7c-b519-401c-8121-ab82de569044" containerName="mariadb-account-create-update" Jan 29 14:20:16 crc kubenswrapper[4753]: E0129 
14:20:16.527449 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94338586-7af0-45e5-b6aa-780295200d4e" containerName="mariadb-account-create-update" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527464 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="94338586-7af0-45e5-b6aa-780295200d4e" containerName="mariadb-account-create-update" Jan 29 14:20:16 crc kubenswrapper[4753]: E0129 14:20:16.527491 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020d51a4-77d3-4fa1-8184-5867ecf2d6d1" containerName="mariadb-database-create" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527504 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="020d51a4-77d3-4fa1-8184-5867ecf2d6d1" containerName="mariadb-database-create" Jan 29 14:20:16 crc kubenswrapper[4753]: E0129 14:20:16.527527 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd640e3a-86af-406d-83c6-2df59b891fc3" containerName="swift-ring-rebalance" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527542 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd640e3a-86af-406d-83c6-2df59b891fc3" containerName="swift-ring-rebalance" Jan 29 14:20:16 crc kubenswrapper[4753]: E0129 14:20:16.527564 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aba56e7-db42-42d9-9586-fb6a145f2a39" containerName="mariadb-database-create" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527577 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aba56e7-db42-42d9-9586-fb6a145f2a39" containerName="mariadb-database-create" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527855 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd640e3a-86af-406d-83c6-2df59b891fc3" containerName="swift-ring-rebalance" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527878 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad50e7c-b519-401c-8121-ab82de569044" containerName="mariadb-account-create-update" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527901 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="94338586-7af0-45e5-b6aa-780295200d4e" containerName="mariadb-account-create-update" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527933 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="020d51a4-77d3-4fa1-8184-5867ecf2d6d1" containerName="mariadb-database-create" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527954 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aba56e7-db42-42d9-9586-fb6a145f2a39" containerName="mariadb-database-create" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.527983 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b88724-5390-491e-a5b3-0b3fbcbf8bd2" containerName="mariadb-account-create-update" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.528931 4753 util.go:30] "No sandbox for pod can be found. 
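
The cpu_manager/state_mem/memory_manager block above is a stale-state sweep: when a new pod is admitted, the resource managers drop assignments still recorded for containers whose pods no longer exist. A sketch of that sweep with invented types; the real cpu_manager stores CPUSets and the memory manager NUMA assignments:

package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops assignments for pods that are no longer active,
// mirroring the "RemoveStaleState" / "Deleted CPUSet assignment" pairs above.
func removeStaleState(assignments map[key]string, activePods map[string]bool) {
	for k := range assignments {
		if activePods[k.podUID] {
			continue
		}
		fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", k.podUID, k.container)
		delete(assignments, k)
		fmt.Printf("Deleted CPUSet assignment podUID=%q containerName=%q\n", k.podUID, k.container)
	}
}

func main() {
	a := map[key]string{
		{"30b88724", "mariadb-account-create-update"}: "0-3",
	}
	removeStaleState(a, map[string]bool{}) // no pods active: everything is stale
}
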
Need to start a new one" pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.533625 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.533760 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-thtgt" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.536923 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gv87j"] Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.643862 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-combined-ca-bundle\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.644199 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-config-data\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.644310 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-db-sync-config-data\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.644638 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9bbt\" (UniqueName: \"kubernetes.io/projected/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-kube-api-access-m9bbt\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.745526 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9bbt\" (UniqueName: \"kubernetes.io/projected/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-kube-api-access-m9bbt\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.745599 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-combined-ca-bundle\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.745625 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-config-data\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.745664 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-db-sync-config-data\") pod 
\"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.751034 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-config-data\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.751175 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-combined-ca-bundle\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.762427 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-db-sync-config-data\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.771420 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9bbt\" (UniqueName: \"kubernetes.io/projected/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-kube-api-access-m9bbt\") pod \"glance-db-sync-gv87j\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:16 crc kubenswrapper[4753]: I0129 14:20:16.853731 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.200696 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rm9d5" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" containerName="ovn-controller" probeResult="failure" output=< Jan 29 14:20:17 crc kubenswrapper[4753]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 29 14:20:17 crc kubenswrapper[4753]: > Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.227789 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.245616 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.434746 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"3cebc2a254087212c9cd362b275183545752ff8879dd6da9c51d87070dfa4dab"} Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.434810 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"f7948c4c2ea98b702b132274cc49f43a7f9f174f2d4a40192b402344998935a2"} Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.443732 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rm9d5-config-dfxwp"] Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.444713 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.452052 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.461304 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-additional-scripts\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.461361 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run-ovn\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.461392 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-scripts\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.461422 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.461456 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j545k\" (UniqueName: \"kubernetes.io/projected/6f3ff39f-51df-4a69-a245-fed0285dbbbe-kube-api-access-j545k\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.461518 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-log-ovn\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.472965 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rm9d5-config-dfxwp"] Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.563327 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-additional-scripts\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.563376 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run-ovn\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.563399 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-scripts\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.563419 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.563442 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j545k\" (UniqueName: \"kubernetes.io/projected/6f3ff39f-51df-4a69-a245-fed0285dbbbe-kube-api-access-j545k\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.563480 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-log-ovn\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.563790 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-log-ovn\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.564433 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-additional-scripts\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.564480 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.564774 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run-ovn\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.565813 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-scripts\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.593614 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gv87j"] Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.609847 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j545k\" (UniqueName: \"kubernetes.io/projected/6f3ff39f-51df-4a69-a245-fed0285dbbbe-kube-api-access-j545k\") pod \"ovn-controller-rm9d5-config-dfxwp\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.739373 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 29 14:20:17 crc kubenswrapper[4753]: I0129 14:20:17.773557 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.039010 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-zjqlx"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.040440 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zjqlx" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.069465 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zjqlx"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.116308 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.142279 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4mdvn"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.145625 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4mdvn" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.206466 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h5cd\" (UniqueName: \"kubernetes.io/projected/b47fe495-e567-4097-82be-eff62586e721-kube-api-access-4h5cd\") pod \"cinder-db-create-zjqlx\" (UID: \"b47fe495-e567-4097-82be-eff62586e721\") " pod="openstack/cinder-db-create-zjqlx" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.206535 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47fe495-e567-4097-82be-eff62586e721-operator-scripts\") pod \"cinder-db-create-zjqlx\" (UID: \"b47fe495-e567-4097-82be-eff62586e721\") " pod="openstack/cinder-db-create-zjqlx" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.237930 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4mdvn"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.237970 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9257-account-create-update-gx8hz"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.239564 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9257-account-create-update-gx8hz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.244290 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9257-account-create-update-gx8hz"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.256387 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.277738 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f8b4-account-create-update-9k466"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.278870 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f8b4-account-create-update-9k466" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.283944 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.301733 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f8b4-account-create-update-9k466"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.311616 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k69qr\" (UniqueName: \"kubernetes.io/projected/a96f22ef-0591-44e5-a14d-4517ca0dcf14-kube-api-access-k69qr\") pod \"barbican-db-create-4mdvn\" (UID: \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\") " pod="openstack/barbican-db-create-4mdvn" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.311732 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h5cd\" (UniqueName: \"kubernetes.io/projected/b47fe495-e567-4097-82be-eff62586e721-kube-api-access-4h5cd\") pod \"cinder-db-create-zjqlx\" (UID: \"b47fe495-e567-4097-82be-eff62586e721\") " pod="openstack/cinder-db-create-zjqlx" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.311775 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47fe495-e567-4097-82be-eff62586e721-operator-scripts\") pod \"cinder-db-create-zjqlx\" (UID: \"b47fe495-e567-4097-82be-eff62586e721\") " pod="openstack/cinder-db-create-zjqlx" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.311903 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a96f22ef-0591-44e5-a14d-4517ca0dcf14-operator-scripts\") pod \"barbican-db-create-4mdvn\" (UID: \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\") " pod="openstack/barbican-db-create-4mdvn" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.312822 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47fe495-e567-4097-82be-eff62586e721-operator-scripts\") pod \"cinder-db-create-zjqlx\" (UID: \"b47fe495-e567-4097-82be-eff62586e721\") " pod="openstack/cinder-db-create-zjqlx" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.335919 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h5cd\" (UniqueName: \"kubernetes.io/projected/b47fe495-e567-4097-82be-eff62586e721-kube-api-access-4h5cd\") pod \"cinder-db-create-zjqlx\" (UID: \"b47fe495-e567-4097-82be-eff62586e721\") " pod="openstack/cinder-db-create-zjqlx" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.406492 4753 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rm9d5-config-dfxwp"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.408814 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zjqlx" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.416956 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5jv2\" (UniqueName: \"kubernetes.io/projected/1b970443-356c-45b4-a764-800416b926e2-kube-api-access-k5jv2\") pod \"barbican-9257-account-create-update-gx8hz\" (UID: \"1b970443-356c-45b4-a764-800416b926e2\") " pod="openstack/barbican-9257-account-create-update-gx8hz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.417053 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k69qr\" (UniqueName: \"kubernetes.io/projected/a96f22ef-0591-44e5-a14d-4517ca0dcf14-kube-api-access-k69qr\") pod \"barbican-db-create-4mdvn\" (UID: \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\") " pod="openstack/barbican-db-create-4mdvn" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.417096 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b970443-356c-45b4-a764-800416b926e2-operator-scripts\") pod \"barbican-9257-account-create-update-gx8hz\" (UID: \"1b970443-356c-45b4-a764-800416b926e2\") " pod="openstack/barbican-9257-account-create-update-gx8hz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.417120 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htc5\" (UniqueName: \"kubernetes.io/projected/43161e39-b904-49c6-be19-c11d01817594-kube-api-access-9htc5\") pod \"cinder-f8b4-account-create-update-9k466\" (UID: \"43161e39-b904-49c6-be19-c11d01817594\") " pod="openstack/cinder-f8b4-account-create-update-9k466" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.417203 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a96f22ef-0591-44e5-a14d-4517ca0dcf14-operator-scripts\") pod \"barbican-db-create-4mdvn\" (UID: \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\") " pod="openstack/barbican-db-create-4mdvn" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.417229 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43161e39-b904-49c6-be19-c11d01817594-operator-scripts\") pod \"cinder-f8b4-account-create-update-9k466\" (UID: \"43161e39-b904-49c6-be19-c11d01817594\") " pod="openstack/cinder-f8b4-account-create-update-9k466" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.418462 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a96f22ef-0591-44e5-a14d-4517ca0dcf14-operator-scripts\") pod \"barbican-db-create-4mdvn\" (UID: \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\") " pod="openstack/barbican-db-create-4mdvn" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.429877 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-np8wt"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.432127 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-np8wt" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.437388 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k69qr\" (UniqueName: \"kubernetes.io/projected/a96f22ef-0591-44e5-a14d-4517ca0dcf14-kube-api-access-k69qr\") pod \"barbican-db-create-4mdvn\" (UID: \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\") " pod="openstack/barbican-db-create-4mdvn" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.460104 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b7f8-account-create-update-b2tcz"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.461086 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b7f8-account-create-update-b2tcz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.463833 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.495559 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-np8wt"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.501915 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"4e7b00175d4cce7fcb0347c3d927ada241593a834021b289319e8fb80e3be8d0"} Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.501947 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"9e38b526a388417cc886c1224a1c67b02aaed0db6c47390206d2232611f965db"} Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.501957 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"5ea90ff3252e8f33e1b49dcc054b155d3ca71041184f597d0ebf57cf05baa4d3"} Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.507301 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b7f8-account-create-update-b2tcz"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.508268 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gv87j" event={"ID":"53c1c8ae-4366-410e-b16b-3d06d55ce6e4","Type":"ContainerStarted","Data":"ab198333c527bc278ca83a5ed2c64a8cc6a99e23519bda938c697c4dbe72f05f"} Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.518024 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4mdvn" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.518533 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nqsz\" (UniqueName: \"kubernetes.io/projected/93698618-cbb9-4158-8c7e-1f37dc143651-kube-api-access-5nqsz\") pod \"neutron-b7f8-account-create-update-b2tcz\" (UID: \"93698618-cbb9-4158-8c7e-1f37dc143651\") " pod="openstack/neutron-b7f8-account-create-update-b2tcz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.518592 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43161e39-b904-49c6-be19-c11d01817594-operator-scripts\") pod \"cinder-f8b4-account-create-update-9k466\" (UID: \"43161e39-b904-49c6-be19-c11d01817594\") " pod="openstack/cinder-f8b4-account-create-update-9k466" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.518616 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5jv2\" (UniqueName: \"kubernetes.io/projected/1b970443-356c-45b4-a764-800416b926e2-kube-api-access-k5jv2\") pod \"barbican-9257-account-create-update-gx8hz\" (UID: \"1b970443-356c-45b4-a764-800416b926e2\") " pod="openstack/barbican-9257-account-create-update-gx8hz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.518673 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/563e2a43-bce3-4ba1-bd5e-25ec89449549-operator-scripts\") pod \"neutron-db-create-np8wt\" (UID: \"563e2a43-bce3-4ba1-bd5e-25ec89449549\") " pod="openstack/neutron-db-create-np8wt" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.518693 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b970443-356c-45b4-a764-800416b926e2-operator-scripts\") pod \"barbican-9257-account-create-update-gx8hz\" (UID: \"1b970443-356c-45b4-a764-800416b926e2\") " pod="openstack/barbican-9257-account-create-update-gx8hz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.518711 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93698618-cbb9-4158-8c7e-1f37dc143651-operator-scripts\") pod \"neutron-b7f8-account-create-update-b2tcz\" (UID: \"93698618-cbb9-4158-8c7e-1f37dc143651\") " pod="openstack/neutron-b7f8-account-create-update-b2tcz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.518729 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9htc5\" (UniqueName: \"kubernetes.io/projected/43161e39-b904-49c6-be19-c11d01817594-kube-api-access-9htc5\") pod \"cinder-f8b4-account-create-update-9k466\" (UID: \"43161e39-b904-49c6-be19-c11d01817594\") " pod="openstack/cinder-f8b4-account-create-update-9k466" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.518795 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4zn4\" (UniqueName: \"kubernetes.io/projected/563e2a43-bce3-4ba1-bd5e-25ec89449549-kube-api-access-q4zn4\") pod \"neutron-db-create-np8wt\" (UID: \"563e2a43-bce3-4ba1-bd5e-25ec89449549\") " pod="openstack/neutron-db-create-np8wt" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.519439 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43161e39-b904-49c6-be19-c11d01817594-operator-scripts\") pod \"cinder-f8b4-account-create-update-9k466\" (UID: \"43161e39-b904-49c6-be19-c11d01817594\") " pod="openstack/cinder-f8b4-account-create-update-9k466" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.522742 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b970443-356c-45b4-a764-800416b926e2-operator-scripts\") pod \"barbican-9257-account-create-update-gx8hz\" (UID: \"1b970443-356c-45b4-a764-800416b926e2\") " pod="openstack/barbican-9257-account-create-update-gx8hz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.538468 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5jv2\" (UniqueName: \"kubernetes.io/projected/1b970443-356c-45b4-a764-800416b926e2-kube-api-access-k5jv2\") pod \"barbican-9257-account-create-update-gx8hz\" (UID: \"1b970443-356c-45b4-a764-800416b926e2\") " pod="openstack/barbican-9257-account-create-update-gx8hz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.539662 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9htc5\" (UniqueName: \"kubernetes.io/projected/43161e39-b904-49c6-be19-c11d01817594-kube-api-access-9htc5\") pod \"cinder-f8b4-account-create-update-9k466\" (UID: \"43161e39-b904-49c6-be19-c11d01817594\") " pod="openstack/cinder-f8b4-account-create-update-9k466" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.582866 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-n9g6j"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.584035 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.590381 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.590746 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rq2fm" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.590903 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.591027 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.612045 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-n9g6j"] Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.623553 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/563e2a43-bce3-4ba1-bd5e-25ec89449549-operator-scripts\") pod \"neutron-db-create-np8wt\" (UID: \"563e2a43-bce3-4ba1-bd5e-25ec89449549\") " pod="openstack/neutron-db-create-np8wt" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.623611 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93698618-cbb9-4158-8c7e-1f37dc143651-operator-scripts\") pod \"neutron-b7f8-account-create-update-b2tcz\" (UID: \"93698618-cbb9-4158-8c7e-1f37dc143651\") " pod="openstack/neutron-b7f8-account-create-update-b2tcz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.623668 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4zn4\" (UniqueName: \"kubernetes.io/projected/563e2a43-bce3-4ba1-bd5e-25ec89449549-kube-api-access-q4zn4\") pod \"neutron-db-create-np8wt\" (UID: \"563e2a43-bce3-4ba1-bd5e-25ec89449549\") " pod="openstack/neutron-db-create-np8wt" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.623702 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nqsz\" (UniqueName: \"kubernetes.io/projected/93698618-cbb9-4158-8c7e-1f37dc143651-kube-api-access-5nqsz\") pod \"neutron-b7f8-account-create-update-b2tcz\" (UID: \"93698618-cbb9-4158-8c7e-1f37dc143651\") " pod="openstack/neutron-b7f8-account-create-update-b2tcz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.624793 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93698618-cbb9-4158-8c7e-1f37dc143651-operator-scripts\") pod \"neutron-b7f8-account-create-update-b2tcz\" (UID: \"93698618-cbb9-4158-8c7e-1f37dc143651\") " pod="openstack/neutron-b7f8-account-create-update-b2tcz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.626387 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/563e2a43-bce3-4ba1-bd5e-25ec89449549-operator-scripts\") pod \"neutron-db-create-np8wt\" (UID: \"563e2a43-bce3-4ba1-bd5e-25ec89449549\") " pod="openstack/neutron-db-create-np8wt" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.643382 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nqsz\" (UniqueName: \"kubernetes.io/projected/93698618-cbb9-4158-8c7e-1f37dc143651-kube-api-access-5nqsz\") pod 
\"neutron-b7f8-account-create-update-b2tcz\" (UID: \"93698618-cbb9-4158-8c7e-1f37dc143651\") " pod="openstack/neutron-b7f8-account-create-update-b2tcz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.647013 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4zn4\" (UniqueName: \"kubernetes.io/projected/563e2a43-bce3-4ba1-bd5e-25ec89449549-kube-api-access-q4zn4\") pod \"neutron-db-create-np8wt\" (UID: \"563e2a43-bce3-4ba1-bd5e-25ec89449549\") " pod="openstack/neutron-db-create-np8wt" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.683328 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9257-account-create-update-gx8hz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.705428 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f8b4-account-create-update-9k466" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.726101 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-combined-ca-bundle\") pod \"keystone-db-sync-n9g6j\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") " pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.726189 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gkfl\" (UniqueName: \"kubernetes.io/projected/ed4d464b-59be-4a41-b9bc-2f16147c8ced-kube-api-access-9gkfl\") pod \"keystone-db-sync-n9g6j\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") " pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.726239 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-config-data\") pod \"keystone-db-sync-n9g6j\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") " pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.758541 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-np8wt" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.791033 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b7f8-account-create-update-b2tcz" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.828137 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-combined-ca-bundle\") pod \"keystone-db-sync-n9g6j\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") " pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.828234 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gkfl\" (UniqueName: \"kubernetes.io/projected/ed4d464b-59be-4a41-b9bc-2f16147c8ced-kube-api-access-9gkfl\") pod \"keystone-db-sync-n9g6j\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") " pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.828279 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-config-data\") pod \"keystone-db-sync-n9g6j\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") " pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.832905 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-combined-ca-bundle\") pod \"keystone-db-sync-n9g6j\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") " pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.836496 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-config-data\") pod \"keystone-db-sync-n9g6j\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") " pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.851999 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gkfl\" (UniqueName: \"kubernetes.io/projected/ed4d464b-59be-4a41-b9bc-2f16147c8ced-kube-api-access-9gkfl\") pod \"keystone-db-sync-n9g6j\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") " pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.909542 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-n9g6j" Jan 29 14:20:18 crc kubenswrapper[4753]: W0129 14:20:18.980656 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda96f22ef_0591_44e5_a14d_4517ca0dcf14.slice/crio-9af6f7c2d9fc4570747eaa4fe7df8ca0d0240ea78bf84b6ecc65f0b3a5e75211 WatchSource:0}: Error finding container 9af6f7c2d9fc4570747eaa4fe7df8ca0d0240ea78bf84b6ecc65f0b3a5e75211: Status 404 returned error can't find the container with id 9af6f7c2d9fc4570747eaa4fe7df8ca0d0240ea78bf84b6ecc65f0b3a5e75211 Jan 29 14:20:18 crc kubenswrapper[4753]: I0129 14:20:18.982604 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4mdvn"] Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.103504 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zjqlx"] Jan 29 14:20:19 crc kubenswrapper[4753]: W0129 14:20:19.116068 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb47fe495_e567_4097_82be_eff62586e721.slice/crio-81f9077c8edf35e256844e7c659a3953d4665192637b8480f37df65d1886c75b WatchSource:0}: Error finding container 81f9077c8edf35e256844e7c659a3953d4665192637b8480f37df65d1886c75b: Status 404 returned error can't find the container with id 81f9077c8edf35e256844e7c659a3953d4665192637b8480f37df65d1886c75b Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.381000 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f8b4-account-create-update-9k466"] Jan 29 14:20:19 crc kubenswrapper[4753]: W0129 14:20:19.394492 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93698618_cbb9_4158_8c7e_1f37dc143651.slice/crio-2ac8b299848fe0c8aa328913e75268302749d035c5ee24efceb025a5669576e2 WatchSource:0}: Error finding container 2ac8b299848fe0c8aa328913e75268302749d035c5ee24efceb025a5669576e2: Status 404 returned error can't find the container with id 2ac8b299848fe0c8aa328913e75268302749d035c5ee24efceb025a5669576e2 Jan 29 14:20:19 crc kubenswrapper[4753]: W0129 14:20:19.396416 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43161e39_b904_49c6_be19_c11d01817594.slice/crio-b190c57913df060ee049c004e6f75a6d0814b17e3f078f83e1731b26ab303601 WatchSource:0}: Error finding container b190c57913df060ee049c004e6f75a6d0814b17e3f078f83e1731b26ab303601: Status 404 returned error can't find the container with id b190c57913df060ee049c004e6f75a6d0814b17e3f078f83e1731b26ab303601 Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.402429 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b7f8-account-create-update-b2tcz"] Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.414460 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9257-account-create-update-gx8hz"] Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.426745 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-np8wt"] Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.529834 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"3c1a5183ff790d39a91d088d0d1476e957753555d63f7eb99e9f9efc1ecd852f"} Jan 29 14:20:19 crc 
kubenswrapper[4753]: I0129 14:20:19.529879 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerStarted","Data":"f2a0991e4200eaba753ab9243604efaa3906af593fb0112ee1a38be84f321412"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.534572 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f8b4-account-create-update-9k466" event={"ID":"43161e39-b904-49c6-be19-c11d01817594","Type":"ContainerStarted","Data":"b190c57913df060ee049c004e6f75a6d0814b17e3f078f83e1731b26ab303601"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.536875 4753 generic.go:334] "Generic (PLEG): container finished" podID="6f3ff39f-51df-4a69-a245-fed0285dbbbe" containerID="a9f3c8865dd90ae8cd702248995b59416d6e48fd3e228641478bf40b18db9234" exitCode=0 Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.537025 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rm9d5-config-dfxwp" event={"ID":"6f3ff39f-51df-4a69-a245-fed0285dbbbe","Type":"ContainerDied","Data":"a9f3c8865dd90ae8cd702248995b59416d6e48fd3e228641478bf40b18db9234"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.537118 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rm9d5-config-dfxwp" event={"ID":"6f3ff39f-51df-4a69-a245-fed0285dbbbe","Type":"ContainerStarted","Data":"c2b4b68d1606a337a1e355f36adbee4980ec354487743a3f1c3e19a44f3b5ef0"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.540032 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4mdvn" event={"ID":"a96f22ef-0591-44e5-a14d-4517ca0dcf14","Type":"ContainerStarted","Data":"1a0cd98dd6eddf05f0e42f6c2fc60a459aac1e62568241e836fba1754c6c671e"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.540073 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4mdvn" event={"ID":"a96f22ef-0591-44e5-a14d-4517ca0dcf14","Type":"ContainerStarted","Data":"9af6f7c2d9fc4570747eaa4fe7df8ca0d0240ea78bf84b6ecc65f0b3a5e75211"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.542067 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9257-account-create-update-gx8hz" event={"ID":"1b970443-356c-45b4-a764-800416b926e2","Type":"ContainerStarted","Data":"b7bf4a00fcc1fdfd770f26badebdbaac6cc4c01d7cd40bd9f543f772b1a600a5"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.543914 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-np8wt" event={"ID":"563e2a43-bce3-4ba1-bd5e-25ec89449549","Type":"ContainerStarted","Data":"04ce171151c6d0b684aba2720ce50bf1deef19387f0453d347a1b5a024346690"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.545728 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7f8-account-create-update-b2tcz" event={"ID":"93698618-cbb9-4158-8c7e-1f37dc143651","Type":"ContainerStarted","Data":"2ac8b299848fe0c8aa328913e75268302749d035c5ee24efceb025a5669576e2"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.549876 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zjqlx" event={"ID":"b47fe495-e567-4097-82be-eff62586e721","Type":"ContainerStarted","Data":"600dea35b4935654339a65c2998a749f0c3146c44c8980223cca2ab749d53237"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.549928 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zjqlx" 
event={"ID":"b47fe495-e567-4097-82be-eff62586e721","Type":"ContainerStarted","Data":"81f9077c8edf35e256844e7c659a3953d4665192637b8480f37df65d1886c75b"} Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.580042 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.022626525 podStartE2EDuration="27.580017273s" podCreationTimestamp="2026-01-29 14:19:52 +0000 UTC" firstStartedPulling="2026-01-29 14:20:10.479170483 +0000 UTC m=+1045.173904865" lastFinishedPulling="2026-01-29 14:20:17.036561191 +0000 UTC m=+1051.731295613" observedRunningTime="2026-01-29 14:20:19.576548179 +0000 UTC m=+1054.271282551" watchObservedRunningTime="2026-01-29 14:20:19.580017273 +0000 UTC m=+1054.274751655" Jan 29 14:20:19 crc kubenswrapper[4753]: W0129 14:20:19.607785 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded4d464b_59be_4a41_b9bc_2f16147c8ced.slice/crio-7ed66b75810a3565a2882f6a4eb3a01082d7d54952fd3b02d9262ce39a2088bc WatchSource:0}: Error finding container 7ed66b75810a3565a2882f6a4eb3a01082d7d54952fd3b02d9262ce39a2088bc: Status 404 returned error can't find the container with id 7ed66b75810a3565a2882f6a4eb3a01082d7d54952fd3b02d9262ce39a2088bc Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.614646 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-n9g6j"] Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.697603 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xcsvp"] Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.698975 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xcsvp" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.703167 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.735844 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xcsvp"] Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.848588 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g59kr\" (UniqueName: \"kubernetes.io/projected/1f613d6c-6411-47bd-8e65-fd132ae3e874-kube-api-access-g59kr\") pod \"root-account-create-update-xcsvp\" (UID: \"1f613d6c-6411-47bd-8e65-fd132ae3e874\") " pod="openstack/root-account-create-update-xcsvp" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.848637 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f613d6c-6411-47bd-8e65-fd132ae3e874-operator-scripts\") pod \"root-account-create-update-xcsvp\" (UID: \"1f613d6c-6411-47bd-8e65-fd132ae3e874\") " pod="openstack/root-account-create-update-xcsvp" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.884781 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-jvknq"] Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.886687 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.899282 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.909224 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-jvknq"] Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.951934 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g59kr\" (UniqueName: \"kubernetes.io/projected/1f613d6c-6411-47bd-8e65-fd132ae3e874-kube-api-access-g59kr\") pod \"root-account-create-update-xcsvp\" (UID: \"1f613d6c-6411-47bd-8e65-fd132ae3e874\") " pod="openstack/root-account-create-update-xcsvp" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.951983 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f613d6c-6411-47bd-8e65-fd132ae3e874-operator-scripts\") pod \"root-account-create-update-xcsvp\" (UID: \"1f613d6c-6411-47bd-8e65-fd132ae3e874\") " pod="openstack/root-account-create-update-xcsvp" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.952012 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-config\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.952091 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-svc\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.952107 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-nb\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.952209 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-swift-storage-0\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.952255 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28xg\" (UniqueName: \"kubernetes.io/projected/60785353-1684-46ad-92ed-4d984254055e-kube-api-access-f28xg\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.952403 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.952746 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f613d6c-6411-47bd-8e65-fd132ae3e874-operator-scripts\") pod \"root-account-create-update-xcsvp\" (UID: \"1f613d6c-6411-47bd-8e65-fd132ae3e874\") " pod="openstack/root-account-create-update-xcsvp" Jan 29 14:20:19 crc kubenswrapper[4753]: I0129 14:20:19.976227 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g59kr\" (UniqueName: \"kubernetes.io/projected/1f613d6c-6411-47bd-8e65-fd132ae3e874-kube-api-access-g59kr\") pod \"root-account-create-update-xcsvp\" (UID: \"1f613d6c-6411-47bd-8e65-fd132ae3e874\") " pod="openstack/root-account-create-update-xcsvp" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.040132 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xcsvp" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.054234 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-svc\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.054279 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-nb\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.054308 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-swift-storage-0\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.054328 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28xg\" (UniqueName: \"kubernetes.io/projected/60785353-1684-46ad-92ed-4d984254055e-kube-api-access-f28xg\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.054382 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-sb\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.054423 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-config\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.055448 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-config\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.055607 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-nb\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.056215 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-sb\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.056368 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-swift-storage-0\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.056717 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-svc\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.092003 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28xg\" (UniqueName: \"kubernetes.io/projected/60785353-1684-46ad-92ed-4d984254055e-kube-api-access-f28xg\") pod \"dnsmasq-dns-75bdffd66f-jvknq\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.200863 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.553107 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xcsvp"] Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.559355 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n9g6j" event={"ID":"ed4d464b-59be-4a41-b9bc-2f16147c8ced","Type":"ContainerStarted","Data":"7ed66b75810a3565a2882f6a4eb3a01082d7d54952fd3b02d9262ce39a2088bc"} Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.561299 4753 generic.go:334] "Generic (PLEG): container finished" podID="93698618-cbb9-4158-8c7e-1f37dc143651" containerID="7d9bc6a338bfd8692c4f35a236d7b9e2ef5f2c475ad33ced83ef75edf0c8103b" exitCode=0 Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.561383 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7f8-account-create-update-b2tcz" event={"ID":"93698618-cbb9-4158-8c7e-1f37dc143651","Type":"ContainerDied","Data":"7d9bc6a338bfd8692c4f35a236d7b9e2ef5f2c475ad33ced83ef75edf0c8103b"} Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.566931 4753 generic.go:334] "Generic (PLEG): container finished" podID="b47fe495-e567-4097-82be-eff62586e721" containerID="600dea35b4935654339a65c2998a749f0c3146c44c8980223cca2ab749d53237" exitCode=0 Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.566976 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zjqlx" event={"ID":"b47fe495-e567-4097-82be-eff62586e721","Type":"ContainerDied","Data":"600dea35b4935654339a65c2998a749f0c3146c44c8980223cca2ab749d53237"} Jan 29 14:20:20 crc kubenswrapper[4753]: W0129 14:20:20.566987 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f613d6c_6411_47bd_8e65_fd132ae3e874.slice/crio-950b65f1fd2f823c4e85741a48fa1cbc98fe61db9e6971c59e9c54672276ccbd WatchSource:0}: Error finding container 950b65f1fd2f823c4e85741a48fa1cbc98fe61db9e6971c59e9c54672276ccbd: Status 404 returned error can't find the container with id 950b65f1fd2f823c4e85741a48fa1cbc98fe61db9e6971c59e9c54672276ccbd Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.570834 4753 generic.go:334] "Generic (PLEG): container finished" podID="43161e39-b904-49c6-be19-c11d01817594" containerID="52d311fc58397250b09be34c36b15dfcfcdccde7d8b80b2368229261729b241a" exitCode=0 Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.570882 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f8b4-account-create-update-9k466" event={"ID":"43161e39-b904-49c6-be19-c11d01817594","Type":"ContainerDied","Data":"52d311fc58397250b09be34c36b15dfcfcdccde7d8b80b2368229261729b241a"} Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.576731 4753 generic.go:334] "Generic (PLEG): container finished" podID="a96f22ef-0591-44e5-a14d-4517ca0dcf14" containerID="1a0cd98dd6eddf05f0e42f6c2fc60a459aac1e62568241e836fba1754c6c671e" exitCode=0 Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.576828 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4mdvn" event={"ID":"a96f22ef-0591-44e5-a14d-4517ca0dcf14","Type":"ContainerDied","Data":"1a0cd98dd6eddf05f0e42f6c2fc60a459aac1e62568241e836fba1754c6c671e"} Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.594733 4753 generic.go:334] "Generic (PLEG): container finished" podID="1b970443-356c-45b4-a764-800416b926e2" 
containerID="a09633d250fe3b97729c4d076f39aed0c8cbd6633607e39dc28ae59a436fa5e4" exitCode=0 Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.595086 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9257-account-create-update-gx8hz" event={"ID":"1b970443-356c-45b4-a764-800416b926e2","Type":"ContainerDied","Data":"a09633d250fe3b97729c4d076f39aed0c8cbd6633607e39dc28ae59a436fa5e4"} Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.601480 4753 generic.go:334] "Generic (PLEG): container finished" podID="563e2a43-bce3-4ba1-bd5e-25ec89449549" containerID="77bf30ffd3f8171c6519acda2167defb718243931cd2d3e74a471e92923e38b6" exitCode=0 Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.601547 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-np8wt" event={"ID":"563e2a43-bce3-4ba1-bd5e-25ec89449549","Type":"ContainerDied","Data":"77bf30ffd3f8171c6519acda2167defb718243931cd2d3e74a471e92923e38b6"} Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.783417 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-jvknq"] Jan 29 14:20:20 crc kubenswrapper[4753]: W0129 14:20:20.793384 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60785353_1684_46ad_92ed_4d984254055e.slice/crio-162ae6c7846c2215193bc8beef3677fee2b752b32e76cc3a413ab406ff8ab726 WatchSource:0}: Error finding container 162ae6c7846c2215193bc8beef3677fee2b752b32e76cc3a413ab406ff8ab726: Status 404 returned error can't find the container with id 162ae6c7846c2215193bc8beef3677fee2b752b32e76cc3a413ab406ff8ab726 Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.917549 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4mdvn" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.969253 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k69qr\" (UniqueName: \"kubernetes.io/projected/a96f22ef-0591-44e5-a14d-4517ca0dcf14-kube-api-access-k69qr\") pod \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\" (UID: \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\") " Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.969318 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a96f22ef-0591-44e5-a14d-4517ca0dcf14-operator-scripts\") pod \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\" (UID: \"a96f22ef-0591-44e5-a14d-4517ca0dcf14\") " Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.969964 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a96f22ef-0591-44e5-a14d-4517ca0dcf14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a96f22ef-0591-44e5-a14d-4517ca0dcf14" (UID: "a96f22ef-0591-44e5-a14d-4517ca0dcf14"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:20 crc kubenswrapper[4753]: I0129 14:20:20.973510 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a96f22ef-0591-44e5-a14d-4517ca0dcf14-kube-api-access-k69qr" (OuterVolumeSpecName: "kube-api-access-k69qr") pod "a96f22ef-0591-44e5-a14d-4517ca0dcf14" (UID: "a96f22ef-0591-44e5-a14d-4517ca0dcf14"). InnerVolumeSpecName "kube-api-access-k69qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.009569 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zjqlx" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.015340 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rm9d5-config-dfxwp" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.071591 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-log-ovn\") pod \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.072286 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-scripts\") pod \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.072403 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-additional-scripts\") pod \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.072524 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j545k\" (UniqueName: \"kubernetes.io/projected/6f3ff39f-51df-4a69-a245-fed0285dbbbe-kube-api-access-j545k\") pod \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.072742 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h5cd\" (UniqueName: \"kubernetes.io/projected/b47fe495-e567-4097-82be-eff62586e721-kube-api-access-4h5cd\") pod \"b47fe495-e567-4097-82be-eff62586e721\" (UID: \"b47fe495-e567-4097-82be-eff62586e721\") " Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.072867 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run\") pod \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.073303 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run-ovn\") pod \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\" (UID: \"6f3ff39f-51df-4a69-a245-fed0285dbbbe\") " Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.073370 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47fe495-e567-4097-82be-eff62586e721-operator-scripts\") pod \"b47fe495-e567-4097-82be-eff62586e721\" (UID: \"b47fe495-e567-4097-82be-eff62586e721\") " Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.073554 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run" (OuterVolumeSpecName: "var-run") pod "6f3ff39f-51df-4a69-a245-fed0285dbbbe" 
(UID: "6f3ff39f-51df-4a69-a245-fed0285dbbbe"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.073649 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6f3ff39f-51df-4a69-a245-fed0285dbbbe" (UID: "6f3ff39f-51df-4a69-a245-fed0285dbbbe"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.074582 4753 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.074601 4753 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.074613 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k69qr\" (UniqueName: \"kubernetes.io/projected/a96f22ef-0591-44e5-a14d-4517ca0dcf14-kube-api-access-k69qr\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.074623 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a96f22ef-0591-44e5-a14d-4517ca0dcf14-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.075082 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b47fe495-e567-4097-82be-eff62586e721-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b47fe495-e567-4097-82be-eff62586e721" (UID: "b47fe495-e567-4097-82be-eff62586e721"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.075128 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6f3ff39f-51df-4a69-a245-fed0285dbbbe" (UID: "6f3ff39f-51df-4a69-a245-fed0285dbbbe"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.076074 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-scripts" (OuterVolumeSpecName: "scripts") pod "6f3ff39f-51df-4a69-a245-fed0285dbbbe" (UID: "6f3ff39f-51df-4a69-a245-fed0285dbbbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.076245 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6f3ff39f-51df-4a69-a245-fed0285dbbbe" (UID: "6f3ff39f-51df-4a69-a245-fed0285dbbbe"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.079771 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3ff39f-51df-4a69-a245-fed0285dbbbe-kube-api-access-j545k" (OuterVolumeSpecName: "kube-api-access-j545k") pod "6f3ff39f-51df-4a69-a245-fed0285dbbbe" (UID: "6f3ff39f-51df-4a69-a245-fed0285dbbbe"). InnerVolumeSpecName "kube-api-access-j545k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.092816 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b47fe495-e567-4097-82be-eff62586e721-kube-api-access-4h5cd" (OuterVolumeSpecName: "kube-api-access-4h5cd") pod "b47fe495-e567-4097-82be-eff62586e721" (UID: "b47fe495-e567-4097-82be-eff62586e721"). InnerVolumeSpecName "kube-api-access-4h5cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.176846 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.176877 4753 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3ff39f-51df-4a69-a245-fed0285dbbbe-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.176890 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j545k\" (UniqueName: \"kubernetes.io/projected/6f3ff39f-51df-4a69-a245-fed0285dbbbe-kube-api-access-j545k\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.176901 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h5cd\" (UniqueName: \"kubernetes.io/projected/b47fe495-e567-4097-82be-eff62586e721-kube-api-access-4h5cd\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.176911 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b47fe495-e567-4097-82be-eff62586e721-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.176922 4753 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f3ff39f-51df-4a69-a245-fed0285dbbbe-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.611252 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4mdvn" event={"ID":"a96f22ef-0591-44e5-a14d-4517ca0dcf14","Type":"ContainerDied","Data":"9af6f7c2d9fc4570747eaa4fe7df8ca0d0240ea78bf84b6ecc65f0b3a5e75211"} Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.611740 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9af6f7c2d9fc4570747eaa4fe7df8ca0d0240ea78bf84b6ecc65f0b3a5e75211" Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.611412 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4mdvn"
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.612992 4753 generic.go:334] "Generic (PLEG): container finished" podID="1f613d6c-6411-47bd-8e65-fd132ae3e874" containerID="04cf2bb41576bf58389254d683314aee72861c9ebd5be98e2ebdeec419c78c85" exitCode=0
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.613041 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xcsvp" event={"ID":"1f613d6c-6411-47bd-8e65-fd132ae3e874","Type":"ContainerDied","Data":"04cf2bb41576bf58389254d683314aee72861c9ebd5be98e2ebdeec419c78c85"}
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.613067 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xcsvp" event={"ID":"1f613d6c-6411-47bd-8e65-fd132ae3e874","Type":"ContainerStarted","Data":"950b65f1fd2f823c4e85741a48fa1cbc98fe61db9e6971c59e9c54672276ccbd"}
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.614963 4753 generic.go:334] "Generic (PLEG): container finished" podID="60785353-1684-46ad-92ed-4d984254055e" containerID="ff8c57e9cd091f2d8bb0c5ce1996f9a36382b6cc2b2708bdfe2dfcc845e0dde9" exitCode=0
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.615031 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" event={"ID":"60785353-1684-46ad-92ed-4d984254055e","Type":"ContainerDied","Data":"ff8c57e9cd091f2d8bb0c5ce1996f9a36382b6cc2b2708bdfe2dfcc845e0dde9"}
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.615048 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" event={"ID":"60785353-1684-46ad-92ed-4d984254055e","Type":"ContainerStarted","Data":"162ae6c7846c2215193bc8beef3677fee2b752b32e76cc3a413ab406ff8ab726"}
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.621789 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zjqlx" event={"ID":"b47fe495-e567-4097-82be-eff62586e721","Type":"ContainerDied","Data":"81f9077c8edf35e256844e7c659a3953d4665192637b8480f37df65d1886c75b"}
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.621855 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f9077c8edf35e256844e7c659a3953d4665192637b8480f37df65d1886c75b"
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.621821 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zjqlx"
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.629504 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rm9d5-config-dfxwp" event={"ID":"6f3ff39f-51df-4a69-a245-fed0285dbbbe","Type":"ContainerDied","Data":"c2b4b68d1606a337a1e355f36adbee4980ec354487743a3f1c3e19a44f3b5ef0"}
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.629549 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b4b68d1606a337a1e355f36adbee4980ec354487743a3f1c3e19a44f3b5ef0"
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.629584 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rm9d5-config-dfxwp"
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.913099 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f8b4-account-create-update-9k466"
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.991687 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43161e39-b904-49c6-be19-c11d01817594-operator-scripts\") pod \"43161e39-b904-49c6-be19-c11d01817594\" (UID: \"43161e39-b904-49c6-be19-c11d01817594\") "
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.991744 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9htc5\" (UniqueName: \"kubernetes.io/projected/43161e39-b904-49c6-be19-c11d01817594-kube-api-access-9htc5\") pod \"43161e39-b904-49c6-be19-c11d01817594\" (UID: \"43161e39-b904-49c6-be19-c11d01817594\") "
Jan 29 14:20:21 crc kubenswrapper[4753]: I0129 14:20:21.993578 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43161e39-b904-49c6-be19-c11d01817594-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43161e39-b904-49c6-be19-c11d01817594" (UID: "43161e39-b904-49c6-be19-c11d01817594"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:21.999442 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43161e39-b904-49c6-be19-c11d01817594-kube-api-access-9htc5" (OuterVolumeSpecName: "kube-api-access-9htc5") pod "43161e39-b904-49c6-be19-c11d01817594" (UID: "43161e39-b904-49c6-be19-c11d01817594"). InnerVolumeSpecName "kube-api-access-9htc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.094553 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43161e39-b904-49c6-be19-c11d01817594-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.094580 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9htc5\" (UniqueName: \"kubernetes.io/projected/43161e39-b904-49c6-be19-c11d01817594-kube-api-access-9htc5\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.146817 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rm9d5-config-dfxwp"]
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.206684 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rm9d5-config-dfxwp"]
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.206746 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rm9d5"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.270060 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b7f8-account-create-update-b2tcz"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.278830 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9257-account-create-update-gx8hz"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.282469 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-np8wt"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.398090 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5jv2\" (UniqueName: \"kubernetes.io/projected/1b970443-356c-45b4-a764-800416b926e2-kube-api-access-k5jv2\") pod \"1b970443-356c-45b4-a764-800416b926e2\" (UID: \"1b970443-356c-45b4-a764-800416b926e2\") "
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.398138 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4zn4\" (UniqueName: \"kubernetes.io/projected/563e2a43-bce3-4ba1-bd5e-25ec89449549-kube-api-access-q4zn4\") pod \"563e2a43-bce3-4ba1-bd5e-25ec89449549\" (UID: \"563e2a43-bce3-4ba1-bd5e-25ec89449549\") "
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.398237 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b970443-356c-45b4-a764-800416b926e2-operator-scripts\") pod \"1b970443-356c-45b4-a764-800416b926e2\" (UID: \"1b970443-356c-45b4-a764-800416b926e2\") "
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.398258 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93698618-cbb9-4158-8c7e-1f37dc143651-operator-scripts\") pod \"93698618-cbb9-4158-8c7e-1f37dc143651\" (UID: \"93698618-cbb9-4158-8c7e-1f37dc143651\") "
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.398278 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/563e2a43-bce3-4ba1-bd5e-25ec89449549-operator-scripts\") pod \"563e2a43-bce3-4ba1-bd5e-25ec89449549\" (UID: \"563e2a43-bce3-4ba1-bd5e-25ec89449549\") "
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.398375 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nqsz\" (UniqueName: \"kubernetes.io/projected/93698618-cbb9-4158-8c7e-1f37dc143651-kube-api-access-5nqsz\") pod \"93698618-cbb9-4158-8c7e-1f37dc143651\" (UID: \"93698618-cbb9-4158-8c7e-1f37dc143651\") "
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.398799 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93698618-cbb9-4158-8c7e-1f37dc143651-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93698618-cbb9-4158-8c7e-1f37dc143651" (UID: "93698618-cbb9-4158-8c7e-1f37dc143651"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.398820 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b970443-356c-45b4-a764-800416b926e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b970443-356c-45b4-a764-800416b926e2" (UID: "1b970443-356c-45b4-a764-800416b926e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.398908 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/563e2a43-bce3-4ba1-bd5e-25ec89449549-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "563e2a43-bce3-4ba1-bd5e-25ec89449549" (UID: "563e2a43-bce3-4ba1-bd5e-25ec89449549"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.402931 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93698618-cbb9-4158-8c7e-1f37dc143651-kube-api-access-5nqsz" (OuterVolumeSpecName: "kube-api-access-5nqsz") pod "93698618-cbb9-4158-8c7e-1f37dc143651" (UID: "93698618-cbb9-4158-8c7e-1f37dc143651"). InnerVolumeSpecName "kube-api-access-5nqsz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.402957 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/563e2a43-bce3-4ba1-bd5e-25ec89449549-kube-api-access-q4zn4" (OuterVolumeSpecName: "kube-api-access-q4zn4") pod "563e2a43-bce3-4ba1-bd5e-25ec89449549" (UID: "563e2a43-bce3-4ba1-bd5e-25ec89449549"). InnerVolumeSpecName "kube-api-access-q4zn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.403276 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b970443-356c-45b4-a764-800416b926e2-kube-api-access-k5jv2" (OuterVolumeSpecName: "kube-api-access-k5jv2") pod "1b970443-356c-45b4-a764-800416b926e2" (UID: "1b970443-356c-45b4-a764-800416b926e2"). InnerVolumeSpecName "kube-api-access-k5jv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.499740 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nqsz\" (UniqueName: \"kubernetes.io/projected/93698618-cbb9-4158-8c7e-1f37dc143651-kube-api-access-5nqsz\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.499771 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5jv2\" (UniqueName: \"kubernetes.io/projected/1b970443-356c-45b4-a764-800416b926e2-kube-api-access-k5jv2\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.499779 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4zn4\" (UniqueName: \"kubernetes.io/projected/563e2a43-bce3-4ba1-bd5e-25ec89449549-kube-api-access-q4zn4\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.499792 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b970443-356c-45b4-a764-800416b926e2-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.499802 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93698618-cbb9-4158-8c7e-1f37dc143651-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.499810 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/563e2a43-bce3-4ba1-bd5e-25ec89449549-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.645840 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-np8wt" event={"ID":"563e2a43-bce3-4ba1-bd5e-25ec89449549","Type":"ContainerDied","Data":"04ce171151c6d0b684aba2720ce50bf1deef19387f0453d347a1b5a024346690"}
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.645888 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04ce171151c6d0b684aba2720ce50bf1deef19387f0453d347a1b5a024346690"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.648037 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-np8wt"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.652676 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" event={"ID":"60785353-1684-46ad-92ed-4d984254055e","Type":"ContainerStarted","Data":"93e83a50b81245ae1c980724868a73a39f9cabd4d61a265bc0e9fc0a573f2612"}
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.652804 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.654621 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7f8-account-create-update-b2tcz" event={"ID":"93698618-cbb9-4158-8c7e-1f37dc143651","Type":"ContainerDied","Data":"2ac8b299848fe0c8aa328913e75268302749d035c5ee24efceb025a5669576e2"}
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.654656 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ac8b299848fe0c8aa328913e75268302749d035c5ee24efceb025a5669576e2"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.654694 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b7f8-account-create-update-b2tcz"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.658637 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f8b4-account-create-update-9k466" event={"ID":"43161e39-b904-49c6-be19-c11d01817594","Type":"ContainerDied","Data":"b190c57913df060ee049c004e6f75a6d0814b17e3f078f83e1731b26ab303601"}
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.658671 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b190c57913df060ee049c004e6f75a6d0814b17e3f078f83e1731b26ab303601"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.658761 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f8b4-account-create-update-9k466"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.661221 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9257-account-create-update-gx8hz"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.664836 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9257-account-create-update-gx8hz" event={"ID":"1b970443-356c-45b4-a764-800416b926e2","Type":"ContainerDied","Data":"b7bf4a00fcc1fdfd770f26badebdbaac6cc4c01d7cd40bd9f543f772b1a600a5"}
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.664866 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7bf4a00fcc1fdfd770f26badebdbaac6cc4c01d7cd40bd9f543f772b1a600a5"
Jan 29 14:20:22 crc kubenswrapper[4753]: I0129 14:20:22.674920 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" podStartSLOduration=3.674882983 podStartE2EDuration="3.674882983s" podCreationTimestamp="2026-01-29 14:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:20:22.673490596 +0000 UTC m=+1057.368225028" watchObservedRunningTime="2026-01-29 14:20:22.674882983 +0000 UTC m=+1057.369617365"
Jan 29 14:20:24 crc kubenswrapper[4753]: I0129 14:20:24.162964 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3ff39f-51df-4a69-a245-fed0285dbbbe" path="/var/lib/kubelet/pods/6f3ff39f-51df-4a69-a245-fed0285dbbbe/volumes"
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.472927 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xcsvp"
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.569223 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g59kr\" (UniqueName: \"kubernetes.io/projected/1f613d6c-6411-47bd-8e65-fd132ae3e874-kube-api-access-g59kr\") pod \"1f613d6c-6411-47bd-8e65-fd132ae3e874\" (UID: \"1f613d6c-6411-47bd-8e65-fd132ae3e874\") "
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.569324 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f613d6c-6411-47bd-8e65-fd132ae3e874-operator-scripts\") pod \"1f613d6c-6411-47bd-8e65-fd132ae3e874\" (UID: \"1f613d6c-6411-47bd-8e65-fd132ae3e874\") "
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.570177 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f613d6c-6411-47bd-8e65-fd132ae3e874-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f613d6c-6411-47bd-8e65-fd132ae3e874" (UID: "1f613d6c-6411-47bd-8e65-fd132ae3e874"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.574670 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f613d6c-6411-47bd-8e65-fd132ae3e874-kube-api-access-g59kr" (OuterVolumeSpecName: "kube-api-access-g59kr") pod "1f613d6c-6411-47bd-8e65-fd132ae3e874" (UID: "1f613d6c-6411-47bd-8e65-fd132ae3e874"). InnerVolumeSpecName "kube-api-access-g59kr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.671799 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g59kr\" (UniqueName: \"kubernetes.io/projected/1f613d6c-6411-47bd-8e65-fd132ae3e874-kube-api-access-g59kr\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.671850 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f613d6c-6411-47bd-8e65-fd132ae3e874-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.701322 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xcsvp" event={"ID":"1f613d6c-6411-47bd-8e65-fd132ae3e874","Type":"ContainerDied","Data":"950b65f1fd2f823c4e85741a48fa1cbc98fe61db9e6971c59e9c54672276ccbd"}
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.701382 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950b65f1fd2f823c4e85741a48fa1cbc98fe61db9e6971c59e9c54672276ccbd"
Jan 29 14:20:25 crc kubenswrapper[4753]: I0129 14:20:25.701395 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xcsvp"
Jan 29 14:20:26 crc kubenswrapper[4753]: I0129 14:20:26.710519 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n9g6j" event={"ID":"ed4d464b-59be-4a41-b9bc-2f16147c8ced","Type":"ContainerStarted","Data":"838908d5e19934d82b12ae2e84d73201f81cdc2b5ee863afff8ead4c7b4b6cbf"}
Jan 29 14:20:26 crc kubenswrapper[4753]: I0129 14:20:26.741639 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-n9g6j" podStartSLOduration=2.876085602 podStartE2EDuration="8.741606568s" podCreationTimestamp="2026-01-29 14:20:18 +0000 UTC" firstStartedPulling="2026-01-29 14:20:19.618169243 +0000 UTC m=+1054.312903625" lastFinishedPulling="2026-01-29 14:20:25.483690199 +0000 UTC m=+1060.178424591" observedRunningTime="2026-01-29 14:20:26.729688485 +0000 UTC m=+1061.424422867" watchObservedRunningTime="2026-01-29 14:20:26.741606568 +0000 UTC m=+1061.436340960"
Jan 29 14:20:30 crc kubenswrapper[4753]: I0129 14:20:30.203190 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq"
Jan 29 14:20:30 crc kubenswrapper[4753]: I0129 14:20:30.275529 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-gzftt"]
Jan 29 14:20:30 crc kubenswrapper[4753]: I0129 14:20:30.275786 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" podUID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" containerName="dnsmasq-dns" containerID="cri-o://e4cc9a6cff40c07ad338240f406155a82abcd20c309cad37862dd5977188bc04" gracePeriod=10
Jan 29 14:20:30 crc kubenswrapper[4753]: I0129 14:20:30.752599 4753 generic.go:334] "Generic (PLEG): container finished" podID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" containerID="e4cc9a6cff40c07ad338240f406155a82abcd20c309cad37862dd5977188bc04" exitCode=0
Jan 29 14:20:30 crc kubenswrapper[4753]: I0129 14:20:30.752669 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" event={"ID":"59b14dcd-09ad-4186-98d6-781ef2a5c3f6","Type":"ContainerDied","Data":"e4cc9a6cff40c07ad338240f406155a82abcd20c309cad37862dd5977188bc04"}
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.103497 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" podUID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused"
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.658635 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt"
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.778591 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt" event={"ID":"59b14dcd-09ad-4186-98d6-781ef2a5c3f6","Type":"ContainerDied","Data":"ee5e4f51a17b89e86aa2886792b34ef32ec14f6a239fe22c97870d3c610e5d52"}
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.778645 4753 scope.go:117] "RemoveContainer" containerID="e4cc9a6cff40c07ad338240f406155a82abcd20c309cad37862dd5977188bc04"
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.778757 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fdf7998c-gzftt"
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.819092 4753 scope.go:117] "RemoveContainer" containerID="6ce01ce4b7948a73a401799af6d07ea4296e432d2c38856a74b0bc9ca6672d2a"
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.850258 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-sb\") pod \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") "
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.850458 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srk8g\" (UniqueName: \"kubernetes.io/projected/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-kube-api-access-srk8g\") pod \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") "
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.851136 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-config\") pod \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") "
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.851245 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-nb\") pod \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") "
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.851374 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-dns-svc\") pod \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\" (UID: \"59b14dcd-09ad-4186-98d6-781ef2a5c3f6\") "
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.854415 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-kube-api-access-srk8g" (OuterVolumeSpecName: "kube-api-access-srk8g") pod "59b14dcd-09ad-4186-98d6-781ef2a5c3f6" (UID: "59b14dcd-09ad-4186-98d6-781ef2a5c3f6"). InnerVolumeSpecName "kube-api-access-srk8g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.894445 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-config" (OuterVolumeSpecName: "config") pod "59b14dcd-09ad-4186-98d6-781ef2a5c3f6" (UID: "59b14dcd-09ad-4186-98d6-781ef2a5c3f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.909370 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59b14dcd-09ad-4186-98d6-781ef2a5c3f6" (UID: "59b14dcd-09ad-4186-98d6-781ef2a5c3f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.911680 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59b14dcd-09ad-4186-98d6-781ef2a5c3f6" (UID: "59b14dcd-09ad-4186-98d6-781ef2a5c3f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.914519 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59b14dcd-09ad-4186-98d6-781ef2a5c3f6" (UID: "59b14dcd-09ad-4186-98d6-781ef2a5c3f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.952833 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.952894 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.952910 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srk8g\" (UniqueName: \"kubernetes.io/projected/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-kube-api-access-srk8g\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.952923 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-config\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:33 crc kubenswrapper[4753]: I0129 14:20:33.952934 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59b14dcd-09ad-4186-98d6-781ef2a5c3f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:34 crc kubenswrapper[4753]: I0129 14:20:34.118408 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-gzftt"]
Jan 29 14:20:34 crc kubenswrapper[4753]: I0129 14:20:34.128875 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67fdf7998c-gzftt"]
Jan 29 14:20:34 crc kubenswrapper[4753]: I0129 14:20:34.166938 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" path="/var/lib/kubelet/pods/59b14dcd-09ad-4186-98d6-781ef2a5c3f6/volumes"
Jan 29 14:20:34 crc kubenswrapper[4753]: I0129 14:20:34.791885 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gv87j" event={"ID":"53c1c8ae-4366-410e-b16b-3d06d55ce6e4","Type":"ContainerStarted","Data":"9de8348dae5eee512c3a686a7ee1add3f75b5b58d76f466710a04d0d910d5cd7"}
Jan 29 14:20:34 crc kubenswrapper[4753]: I0129 14:20:34.794639 4753 generic.go:334] "Generic (PLEG): container finished" podID="ed4d464b-59be-4a41-b9bc-2f16147c8ced" containerID="838908d5e19934d82b12ae2e84d73201f81cdc2b5ee863afff8ead4c7b4b6cbf" exitCode=0
Jan 29 14:20:34 crc kubenswrapper[4753]: I0129 14:20:34.794772 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n9g6j" event={"ID":"ed4d464b-59be-4a41-b9bc-2f16147c8ced","Type":"ContainerDied","Data":"838908d5e19934d82b12ae2e84d73201f81cdc2b5ee863afff8ead4c7b4b6cbf"}
Jan 29 14:20:34 crc kubenswrapper[4753]: I0129 14:20:34.833224 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gv87j" podStartSLOduration=2.977470171 podStartE2EDuration="18.833197825s" podCreationTimestamp="2026-01-29 14:20:16 +0000 UTC" firstStartedPulling="2026-01-29 14:20:17.597820437 +0000 UTC m=+1052.292554809" lastFinishedPulling="2026-01-29 14:20:33.453548081 +0000 UTC m=+1068.148282463" observedRunningTime="2026-01-29 14:20:34.815208929 +0000 UTC m=+1069.509943391" watchObservedRunningTime="2026-01-29 14:20:34.833197825 +0000 UTC m=+1069.527932247"
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.212436 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-n9g6j"
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.399980 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-combined-ca-bundle\") pod \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") "
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.400100 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gkfl\" (UniqueName: \"kubernetes.io/projected/ed4d464b-59be-4a41-b9bc-2f16147c8ced-kube-api-access-9gkfl\") pod \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") "
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.400328 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-config-data\") pod \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\" (UID: \"ed4d464b-59be-4a41-b9bc-2f16147c8ced\") "
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.405213 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed4d464b-59be-4a41-b9bc-2f16147c8ced-kube-api-access-9gkfl" (OuterVolumeSpecName: "kube-api-access-9gkfl") pod "ed4d464b-59be-4a41-b9bc-2f16147c8ced" (UID: "ed4d464b-59be-4a41-b9bc-2f16147c8ced"). InnerVolumeSpecName "kube-api-access-9gkfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.435629 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed4d464b-59be-4a41-b9bc-2f16147c8ced" (UID: "ed4d464b-59be-4a41-b9bc-2f16147c8ced"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.466854 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-config-data" (OuterVolumeSpecName: "config-data") pod "ed4d464b-59be-4a41-b9bc-2f16147c8ced" (UID: "ed4d464b-59be-4a41-b9bc-2f16147c8ced"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.502362 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.502392 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gkfl\" (UniqueName: \"kubernetes.io/projected/ed4d464b-59be-4a41-b9bc-2f16147c8ced-kube-api-access-9gkfl\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.502405 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed4d464b-59be-4a41-b9bc-2f16147c8ced-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.815198 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n9g6j" event={"ID":"ed4d464b-59be-4a41-b9bc-2f16147c8ced","Type":"ContainerDied","Data":"7ed66b75810a3565a2882f6a4eb3a01082d7d54952fd3b02d9262ce39a2088bc"}
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.815242 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ed66b75810a3565a2882f6a4eb3a01082d7d54952fd3b02d9262ce39a2088bc"
Jan 29 14:20:36 crc kubenswrapper[4753]: I0129 14:20:36.815249 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-n9g6j"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.129262 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77bbd879b9-sjsfb"]
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145318 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a96f22ef-0591-44e5-a14d-4517ca0dcf14" containerName="mariadb-database-create"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145360 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96f22ef-0591-44e5-a14d-4517ca0dcf14" containerName="mariadb-database-create"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145372 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563e2a43-bce3-4ba1-bd5e-25ec89449549" containerName="mariadb-database-create"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145379 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="563e2a43-bce3-4ba1-bd5e-25ec89449549" containerName="mariadb-database-create"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145405 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3ff39f-51df-4a69-a245-fed0285dbbbe" containerName="ovn-config"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145411 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3ff39f-51df-4a69-a245-fed0285dbbbe" containerName="ovn-config"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145424 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b47fe495-e567-4097-82be-eff62586e721" containerName="mariadb-database-create"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145432 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47fe495-e567-4097-82be-eff62586e721" containerName="mariadb-database-create"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145449 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" containerName="dnsmasq-dns"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145455 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" containerName="dnsmasq-dns"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145475 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4d464b-59be-4a41-b9bc-2f16147c8ced" containerName="keystone-db-sync"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145482 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4d464b-59be-4a41-b9bc-2f16147c8ced" containerName="keystone-db-sync"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145500 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93698618-cbb9-4158-8c7e-1f37dc143651" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145509 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="93698618-cbb9-4158-8c7e-1f37dc143651" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145533 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f613d6c-6411-47bd-8e65-fd132ae3e874" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145539 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f613d6c-6411-47bd-8e65-fd132ae3e874" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145556 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43161e39-b904-49c6-be19-c11d01817594" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145562 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="43161e39-b904-49c6-be19-c11d01817594" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145576 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" containerName="init"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145583 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" containerName="init"
Jan 29 14:20:37 crc kubenswrapper[4753]: E0129 14:20:37.145604 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b970443-356c-45b4-a764-800416b926e2" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145609 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b970443-356c-45b4-a764-800416b926e2" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145959 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3ff39f-51df-4a69-a245-fed0285dbbbe" containerName="ovn-config"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.145983 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a96f22ef-0591-44e5-a14d-4517ca0dcf14" containerName="mariadb-database-create"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.146002 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b47fe495-e567-4097-82be-eff62586e721" containerName="mariadb-database-create"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.146015 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b14dcd-09ad-4186-98d6-781ef2a5c3f6" containerName="dnsmasq-dns"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.146026 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f613d6c-6411-47bd-8e65-fd132ae3e874" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.146049 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed4d464b-59be-4a41-b9bc-2f16147c8ced" containerName="keystone-db-sync"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.146060 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="93698618-cbb9-4158-8c7e-1f37dc143651" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.146072 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="43161e39-b904-49c6-be19-c11d01817594" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.146083 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="563e2a43-bce3-4ba1-bd5e-25ec89449549" containerName="mariadb-database-create"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.146097 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b970443-356c-45b4-a764-800416b926e2" containerName="mariadb-account-create-update"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.159942 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.210454 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77bbd879b9-sjsfb"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.252645 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qfvl6"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.253723 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.260753 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-sb\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.261360 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-swift-storage-0\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.261401 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds2jz\" (UniqueName: \"kubernetes.io/projected/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-kube-api-access-ds2jz\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.261426 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-svc\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.261466 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-config\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.261508 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-nb\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.260902 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rq2fm"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.261302 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.262293 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.263367 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.266143 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.270830 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfvl6"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363065 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-scripts\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363126 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-fernet-keys\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363162 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-config-data\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363219 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-swift-storage-0\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363238 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvr5t\" (UniqueName: \"kubernetes.io/projected/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-kube-api-access-mvr5t\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363260 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds2jz\" (UniqueName: \"kubernetes.io/projected/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-kube-api-access-ds2jz\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363277 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-combined-ca-bundle\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363294 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-svc\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363321 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-config\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363353 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-nb\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363382 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-credential-keys\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.363406 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-sb\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.364283 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-swift-storage-0\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.364470 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-svc\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.365014 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-config\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.365060 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-sb\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.370239 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-drwzl"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.371643 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-drwzl"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.376034 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tn74t"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.376734 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.376853 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.379398 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-nb\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.379705 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-drwzl"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.402932 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds2jz\" (UniqueName: \"kubernetes.io/projected/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-kube-api-access-ds2jz\") pod \"dnsmasq-dns-77bbd879b9-sjsfb\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.439241 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-zlbdz"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.440302 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zlbdz"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.445012 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.445305 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5crk2"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.459089 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zlbdz"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465059 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfbwr\" (UniqueName: \"kubernetes.io/projected/07dc97c8-9d59-458c-81af-f83e6f71b09c-kube-api-access-hfbwr\") pod \"neutron-db-sync-drwzl\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " pod="openstack/neutron-db-sync-drwzl"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465123 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-fernet-keys\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465188 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgtn6\" (UniqueName: \"kubernetes.io/projected/5b901113-4b66-4eb9-ac27-ad9446e6aa29-kube-api-access-mgtn6\") pod \"barbican-db-sync-zlbdz\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " pod="openstack/barbican-db-sync-zlbdz"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465218 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-config-data\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465268 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-combined-ca-bundle\") pod \"barbican-db-sync-zlbdz\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " pod="openstack/barbican-db-sync-zlbdz"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465352 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvr5t\" (UniqueName: \"kubernetes.io/projected/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-kube-api-access-mvr5t\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465385 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-combined-ca-bundle\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465468 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-db-sync-config-data\") pod \"barbican-db-sync-zlbdz\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " pod="openstack/barbican-db-sync-zlbdz"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465530 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-config\") pod \"neutron-db-sync-drwzl\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " pod="openstack/neutron-db-sync-drwzl"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465554 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-credential-keys\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465615 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-scripts\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.465644 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-combined-ca-bundle\") pod \"neutron-db-sync-drwzl\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " pod="openstack/neutron-db-sync-drwzl"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.472730 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.484903 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-combined-ca-bundle\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.489919 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-fernet-keys\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.492282 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-credential-keys\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.494092 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-config-data\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.503409 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.505595 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvr5t\" (UniqueName: \"kubernetes.io/projected/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-kube-api-access-mvr5t\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.508399 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.508595 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.524196 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77bbd879b9-sjsfb"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.524780 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.529818 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-scripts\") pod \"keystone-bootstrap-qfvl6\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " pod="openstack/keystone-bootstrap-qfvl6"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.543235 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.556393 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-nhv2c"]
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.557647 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nhv2c"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.561465 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.561943 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.563074 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9nfqw"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.567618 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-combined-ca-bundle\") pod \"neutron-db-sync-drwzl\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " pod="openstack/neutron-db-sync-drwzl"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.567790 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfbwr\" (UniqueName: \"kubernetes.io/projected/07dc97c8-9d59-458c-81af-f83e6f71b09c-kube-api-access-hfbwr\") pod \"neutron-db-sync-drwzl\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " pod="openstack/neutron-db-sync-drwzl"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.567890 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-run-httpd\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.567982 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-log-httpd\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.568111 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgtn6\" (UniqueName: \"kubernetes.io/projected/5b901113-4b66-4eb9-ac27-ad9446e6aa29-kube-api-access-mgtn6\") pod \"barbican-db-sync-zlbdz\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " pod="openstack/barbican-db-sync-zlbdz"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.568215 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-combined-ca-bundle\") pod \"barbican-db-sync-zlbdz\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " pod="openstack/barbican-db-sync-zlbdz"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.568304 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0"
Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.568383 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-scripts\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") 
" pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.568460 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsb42\" (UniqueName: \"kubernetes.io/projected/ae4b866a-f13e-4cfd-95c1-50b64d249217-kube-api-access-dsb42\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.568543 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-db-sync-config-data\") pod \"barbican-db-sync-zlbdz\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " pod="openstack/barbican-db-sync-zlbdz" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.568607 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-config-data\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.568684 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-config\") pod \"neutron-db-sync-drwzl\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " pod="openstack/neutron-db-sync-drwzl" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.568763 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.576275 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nhv2c"] Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.576777 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-db-sync-config-data\") pod \"barbican-db-sync-zlbdz\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " pod="openstack/barbican-db-sync-zlbdz" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.588130 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8495b76777-c62gs"] Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.589608 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-combined-ca-bundle\") pod \"neutron-db-sync-drwzl\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " pod="openstack/neutron-db-sync-drwzl" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.590048 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.590502 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfvl6" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.593810 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfbwr\" (UniqueName: \"kubernetes.io/projected/07dc97c8-9d59-458c-81af-f83e6f71b09c-kube-api-access-hfbwr\") pod \"neutron-db-sync-drwzl\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " pod="openstack/neutron-db-sync-drwzl" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.596590 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-config\") pod \"neutron-db-sync-drwzl\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " pod="openstack/neutron-db-sync-drwzl" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.598785 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-combined-ca-bundle\") pod \"barbican-db-sync-zlbdz\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " pod="openstack/barbican-db-sync-zlbdz" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.610646 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-79v95"] Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.612683 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.622749 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.623091 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mckjn" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.623284 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.633290 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8495b76777-c62gs"] Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.645389 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgtn6\" (UniqueName: \"kubernetes.io/projected/5b901113-4b66-4eb9-ac27-ad9446e6aa29-kube-api-access-mgtn6\") pod \"barbican-db-sync-zlbdz\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " pod="openstack/barbican-db-sync-zlbdz" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.660017 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-79v95"] Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.670493 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-run-httpd\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.670603 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-log-httpd\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.670719 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.670807 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-logs\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.670870 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7b64f0e-f7ef-4737-a543-edba91ed6811-etc-machine-id\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.670937 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-combined-ca-bundle\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.670997 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hqlf\" (UniqueName: \"kubernetes.io/projected/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-kube-api-access-8hqlf\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.671061 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-scripts\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.671139 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-scripts\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.673797 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-config-data\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.673870 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzjm6\" (UniqueName: \"kubernetes.io/projected/5ae2b88d-362f-4152-af3a-95780fc0bdf4-kube-api-access-dzjm6\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.673962 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsb42\" (UniqueName: 
\"kubernetes.io/projected/ae4b866a-f13e-4cfd-95c1-50b64d249217-kube-api-access-dsb42\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686030 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-nb\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686076 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-swift-storage-0\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686125 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-db-sync-config-data\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686183 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwj6\" (UniqueName: \"kubernetes.io/projected/a7b64f0e-f7ef-4737-a543-edba91ed6811-kube-api-access-5jwj6\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686226 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-config-data\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686262 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-svc\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686311 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-config-data\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686338 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-config\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686385 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-combined-ca-bundle\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686419 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-sb\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.686459 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-scripts\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.675075 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-run-httpd\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.675309 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-log-httpd\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.685950 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-scripts\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.688033 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.698841 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.699800 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-config-data\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.721396 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dsb42\" (UniqueName: \"kubernetes.io/projected/ae4b866a-f13e-4cfd-95c1-50b64d249217-kube-api-access-dsb42\") pod \"ceilometer-0\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " pod="openstack/ceilometer-0" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.742196 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-drwzl" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790342 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-logs\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790382 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7b64f0e-f7ef-4737-a543-edba91ed6811-etc-machine-id\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790407 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hqlf\" (UniqueName: \"kubernetes.io/projected/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-kube-api-access-8hqlf\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790426 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-combined-ca-bundle\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790447 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-scripts\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790489 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-config-data\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790508 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzjm6\" (UniqueName: \"kubernetes.io/projected/5ae2b88d-362f-4152-af3a-95780fc0bdf4-kube-api-access-dzjm6\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790555 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-nb\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790572 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-swift-storage-0\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790594 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-db-sync-config-data\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790614 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwj6\" (UniqueName: \"kubernetes.io/projected/a7b64f0e-f7ef-4737-a543-edba91ed6811-kube-api-access-5jwj6\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790638 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-svc\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790658 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-config-data\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790679 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-config\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790706 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-combined-ca-bundle\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790736 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-sb\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.790757 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-scripts\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.792821 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-nb\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.794466 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-logs\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.794500 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7b64f0e-f7ef-4737-a543-edba91ed6811-etc-machine-id\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.794754 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-config\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.795032 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-swift-storage-0\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.797700 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-sb\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.797981 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-combined-ca-bundle\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.799652 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-db-sync-config-data\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.799732 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-svc\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.801940 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-config-data\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 
14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.817796 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-combined-ca-bundle\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.818092 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-scripts\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.818235 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-scripts\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.820109 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-config-data\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.823305 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzjm6\" (UniqueName: \"kubernetes.io/projected/5ae2b88d-362f-4152-af3a-95780fc0bdf4-kube-api-access-dzjm6\") pod \"dnsmasq-dns-8495b76777-c62gs\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.826428 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwj6\" (UniqueName: \"kubernetes.io/projected/a7b64f0e-f7ef-4737-a543-edba91ed6811-kube-api-access-5jwj6\") pod \"cinder-db-sync-79v95\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") " pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.827096 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hqlf\" (UniqueName: \"kubernetes.io/projected/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-kube-api-access-8hqlf\") pod \"placement-db-sync-nhv2c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:37 crc kubenswrapper[4753]: I0129 14:20:37.862572 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zlbdz" Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.003345 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nhv2c" Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.004315 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.012951 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.020649 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-79v95" Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.235458 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfvl6"] Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.332299 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77bbd879b9-sjsfb"] Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.358246 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-zlbdz"] Jan 29 14:20:38 crc kubenswrapper[4753]: W0129 14:20:38.359619 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b901113_4b66_4eb9_ac27_ad9446e6aa29.slice/crio-7d3a97f9ba19502b485d735f9c704a1ffdc2374a727200d07e7d774e26fc38e7 WatchSource:0}: Error finding container 7d3a97f9ba19502b485d735f9c704a1ffdc2374a727200d07e7d774e26fc38e7: Status 404 returned error can't find the container with id 7d3a97f9ba19502b485d735f9c704a1ffdc2374a727200d07e7d774e26fc38e7 Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.410999 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-drwzl"] Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.463028 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nhv2c"] Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.834608 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8495b76777-c62gs"] Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.849769 4753 generic.go:334] "Generic (PLEG): container finished" podID="a0dcff86-df6e-4c59-803f-b9d8c32c3a34" containerID="de97b5c7d968d55eb33cf3d978f2872b798a7f1cc6868e64a877c758ce55e98c" exitCode=0 Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.849829 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb" event={"ID":"a0dcff86-df6e-4c59-803f-b9d8c32c3a34","Type":"ContainerDied","Data":"de97b5c7d968d55eb33cf3d978f2872b798a7f1cc6868e64a877c758ce55e98c"} Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.849855 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb" event={"ID":"a0dcff86-df6e-4c59-803f-b9d8c32c3a34","Type":"ContainerStarted","Data":"e42e9678e00922797d10421d27c1085b4bfe3399ae0bc4090a936458c355ce79"} Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.853424 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nhv2c" event={"ID":"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c","Type":"ContainerStarted","Data":"06e04707d6cc64023d6d85b13ad1b03de9578a1677410578d68ffa7d61cd4b83"} Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.857437 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfvl6" event={"ID":"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c","Type":"ContainerStarted","Data":"bd11cb05b1130591c9e8b25904125b7acdc099679066354cd495cac1fb5d906c"} Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.857505 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfvl6" event={"ID":"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c","Type":"ContainerStarted","Data":"4f9bcf24ae9d652cc881e3c3662b0c2ef65c4a0b5fe57e2e0a89824a003079a6"} Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.859441 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zlbdz" 
event={"ID":"5b901113-4b66-4eb9-ac27-ad9446e6aa29","Type":"ContainerStarted","Data":"7d3a97f9ba19502b485d735f9c704a1ffdc2374a727200d07e7d774e26fc38e7"} Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.864569 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8495b76777-c62gs" event={"ID":"5ae2b88d-362f-4152-af3a-95780fc0bdf4","Type":"ContainerStarted","Data":"967385d61740ff4d82b5cfd1f92478bb7c2bb87a9231699e7acb1e3763ec1947"} Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.866061 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-drwzl" event={"ID":"07dc97c8-9d59-458c-81af-f83e6f71b09c","Type":"ContainerStarted","Data":"14e5df1a722e4fdd9f1cdff60bc64e62c47ca34041f0ff038441520109891f30"} Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.866098 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-drwzl" event={"ID":"07dc97c8-9d59-458c-81af-f83e6f71b09c","Type":"ContainerStarted","Data":"8dc2422e3235c15a5ec383b92b53c27453e9f267a75e63d951c62e4fcfc57a00"} Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.892990 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qfvl6" podStartSLOduration=1.892975452 podStartE2EDuration="1.892975452s" podCreationTimestamp="2026-01-29 14:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:20:38.888991404 +0000 UTC m=+1073.583725786" watchObservedRunningTime="2026-01-29 14:20:38.892975452 +0000 UTC m=+1073.587709834" Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.912807 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-79v95"] Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.923207 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-drwzl" podStartSLOduration=1.923189447 podStartE2EDuration="1.923189447s" podCreationTimestamp="2026-01-29 14:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:20:38.902067957 +0000 UTC m=+1073.596802339" watchObservedRunningTime="2026-01-29 14:20:38.923189447 +0000 UTC m=+1073.617923829" Jan 29 14:20:38 crc kubenswrapper[4753]: I0129 14:20:38.924235 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.162071 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.282558 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-svc\") pod \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.282624 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds2jz\" (UniqueName: \"kubernetes.io/projected/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-kube-api-access-ds2jz\") pod \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.282728 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-config\") pod \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.282811 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-swift-storage-0\") pod \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.282838 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-nb\") pod \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.282911 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-sb\") pod \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\" (UID: \"a0dcff86-df6e-4c59-803f-b9d8c32c3a34\") " Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.286977 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-kube-api-access-ds2jz" (OuterVolumeSpecName: "kube-api-access-ds2jz") pod "a0dcff86-df6e-4c59-803f-b9d8c32c3a34" (UID: "a0dcff86-df6e-4c59-803f-b9d8c32c3a34"). InnerVolumeSpecName "kube-api-access-ds2jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.302546 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-config" (OuterVolumeSpecName: "config") pod "a0dcff86-df6e-4c59-803f-b9d8c32c3a34" (UID: "a0dcff86-df6e-4c59-803f-b9d8c32c3a34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.304621 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a0dcff86-df6e-4c59-803f-b9d8c32c3a34" (UID: "a0dcff86-df6e-4c59-803f-b9d8c32c3a34"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.309347 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a0dcff86-df6e-4c59-803f-b9d8c32c3a34" (UID: "a0dcff86-df6e-4c59-803f-b9d8c32c3a34"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.314737 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a0dcff86-df6e-4c59-803f-b9d8c32c3a34" (UID: "a0dcff86-df6e-4c59-803f-b9d8c32c3a34"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.328218 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0dcff86-df6e-4c59-803f-b9d8c32c3a34" (UID: "a0dcff86-df6e-4c59-803f-b9d8c32c3a34"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.385619 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.385653 4753 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.385665 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.385675 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.385683 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.385694 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds2jz\" (UniqueName: \"kubernetes.io/projected/a0dcff86-df6e-4c59-803f-b9d8c32c3a34-kube-api-access-ds2jz\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.828820 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.900033 4753 generic.go:334] "Generic (PLEG): container finished" podID="5ae2b88d-362f-4152-af3a-95780fc0bdf4" containerID="655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804" exitCode=0 Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.900100 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8495b76777-c62gs" 
event={"ID":"5ae2b88d-362f-4152-af3a-95780fc0bdf4","Type":"ContainerDied","Data":"655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804"} Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.902356 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-79v95" event={"ID":"a7b64f0e-f7ef-4737-a543-edba91ed6811","Type":"ContainerStarted","Data":"b2bf2be42c6804e51c541b9a00622e903f0dbabe0b17f466e73a5738805284ed"} Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.904576 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb" event={"ID":"a0dcff86-df6e-4c59-803f-b9d8c32c3a34","Type":"ContainerDied","Data":"e42e9678e00922797d10421d27c1085b4bfe3399ae0bc4090a936458c355ce79"} Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.904601 4753 scope.go:117] "RemoveContainer" containerID="de97b5c7d968d55eb33cf3d978f2872b798a7f1cc6868e64a877c758ce55e98c" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.904708 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77bbd879b9-sjsfb" Jan 29 14:20:39 crc kubenswrapper[4753]: I0129 14:20:39.911889 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerStarted","Data":"dec543e3fff7226d229688f209f1431e5bd828f37960b101ec7840b804de5960"} Jan 29 14:20:40 crc kubenswrapper[4753]: I0129 14:20:40.130236 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77bbd879b9-sjsfb"] Jan 29 14:20:40 crc kubenswrapper[4753]: I0129 14:20:40.133020 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77bbd879b9-sjsfb"] Jan 29 14:20:40 crc kubenswrapper[4753]: I0129 14:20:40.167726 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0dcff86-df6e-4c59-803f-b9d8c32c3a34" path="/var/lib/kubelet/pods/a0dcff86-df6e-4c59-803f-b9d8c32c3a34/volumes" Jan 29 14:20:42 crc kubenswrapper[4753]: I0129 14:20:42.953987 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8495b76777-c62gs" event={"ID":"5ae2b88d-362f-4152-af3a-95780fc0bdf4","Type":"ContainerStarted","Data":"13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093"} Jan 29 14:20:42 crc kubenswrapper[4753]: I0129 14:20:42.954707 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:42 crc kubenswrapper[4753]: I0129 14:20:42.995542 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8495b76777-c62gs" podStartSLOduration=5.995455901 podStartE2EDuration="5.995455901s" podCreationTimestamp="2026-01-29 14:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:20:42.981950646 +0000 UTC m=+1077.676685068" watchObservedRunningTime="2026-01-29 14:20:42.995455901 +0000 UTC m=+1077.690190293" Jan 29 14:20:44 crc kubenswrapper[4753]: I0129 14:20:44.975703 4753 generic.go:334] "Generic (PLEG): container finished" podID="9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" containerID="bd11cb05b1130591c9e8b25904125b7acdc099679066354cd495cac1fb5d906c" exitCode=0 Jan 29 14:20:44 crc kubenswrapper[4753]: I0129 14:20:44.975788 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfvl6" 
event={"ID":"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c","Type":"ContainerDied","Data":"bd11cb05b1130591c9e8b25904125b7acdc099679066354cd495cac1fb5d906c"} Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.715933 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qfvl6" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.852046 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-scripts\") pod \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.852088 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-config-data\") pod \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.852193 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-credential-keys\") pod \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.852218 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-combined-ca-bundle\") pod \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.852276 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvr5t\" (UniqueName: \"kubernetes.io/projected/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-kube-api-access-mvr5t\") pod \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.852299 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-fernet-keys\") pod \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\" (UID: \"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c\") " Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.859102 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" (UID: "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.859898 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-scripts" (OuterVolumeSpecName: "scripts") pod "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" (UID: "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.860759 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-kube-api-access-mvr5t" (OuterVolumeSpecName: "kube-api-access-mvr5t") pod "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" (UID: "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c"). InnerVolumeSpecName "kube-api-access-mvr5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.873453 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" (UID: "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.890945 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-config-data" (OuterVolumeSpecName: "config-data") pod "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" (UID: "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.900314 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" (UID: "9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.954384 4753 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.954427 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.954442 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvr5t\" (UniqueName: \"kubernetes.io/projected/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-kube-api-access-mvr5t\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.954458 4753 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.954469 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.954484 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.997048 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-qfvl6" event={"ID":"9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c","Type":"ContainerDied","Data":"4f9bcf24ae9d652cc881e3c3662b0c2ef65c4a0b5fe57e2e0a89824a003079a6"} Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.997087 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f9bcf24ae9d652cc881e3c3662b0c2ef65c4a0b5fe57e2e0a89824a003079a6" Jan 29 14:20:46 crc kubenswrapper[4753]: I0129 14:20:46.997140 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qfvl6" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.000297 4753 generic.go:334] "Generic (PLEG): container finished" podID="53c1c8ae-4366-410e-b16b-3d06d55ce6e4" containerID="9de8348dae5eee512c3a686a7ee1add3f75b5b58d76f466710a04d0d910d5cd7" exitCode=0 Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.000347 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gv87j" event={"ID":"53c1c8ae-4366-410e-b16b-3d06d55ce6e4","Type":"ContainerDied","Data":"9de8348dae5eee512c3a686a7ee1add3f75b5b58d76f466710a04d0d910d5cd7"} Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.102598 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qfvl6"] Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.109727 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qfvl6"] Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.173989 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-q5h9z"] Jan 29 14:20:47 crc kubenswrapper[4753]: E0129 14:20:47.174726 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" containerName="keystone-bootstrap" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.174755 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" containerName="keystone-bootstrap" Jan 29 14:20:47 crc kubenswrapper[4753]: E0129 14:20:47.174786 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0dcff86-df6e-4c59-803f-b9d8c32c3a34" containerName="init" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.174797 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0dcff86-df6e-4c59-803f-b9d8c32c3a34" containerName="init" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.175101 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0dcff86-df6e-4c59-803f-b9d8c32c3a34" containerName="init" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.175138 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" containerName="keystone-bootstrap" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.176316 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.180523 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.180878 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.181084 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.181338 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.182116 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q5h9z"] Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.182232 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rq2fm" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.264417 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-combined-ca-bundle\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.264466 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-config-data\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.264493 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-scripts\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.264519 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-fernet-keys\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.264684 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlvz8\" (UniqueName: \"kubernetes.io/projected/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-kube-api-access-jlvz8\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.264816 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-credential-keys\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.366873 4753 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-combined-ca-bundle\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.366911 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-config-data\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.366925 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-scripts\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.366943 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-fernet-keys\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.366985 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlvz8\" (UniqueName: \"kubernetes.io/projected/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-kube-api-access-jlvz8\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.367016 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-credential-keys\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.371901 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-combined-ca-bundle\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.372505 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-fernet-keys\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.374477 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-credential-keys\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.382100 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-config-data\") pod \"keystone-bootstrap-q5h9z\" (UID: 
\"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.382503 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-scripts\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.383988 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlvz8\" (UniqueName: \"kubernetes.io/projected/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-kube-api-access-jlvz8\") pod \"keystone-bootstrap-q5h9z\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:47 crc kubenswrapper[4753]: I0129 14:20:47.501135 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:20:48 crc kubenswrapper[4753]: I0129 14:20:48.016239 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:20:48 crc kubenswrapper[4753]: I0129 14:20:48.088212 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-jvknq"] Jan 29 14:20:48 crc kubenswrapper[4753]: I0129 14:20:48.088617 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" podUID="60785353-1684-46ad-92ed-4d984254055e" containerName="dnsmasq-dns" containerID="cri-o://93e83a50b81245ae1c980724868a73a39f9cabd4d61a265bc0e9fc0a573f2612" gracePeriod=10 Jan 29 14:20:48 crc kubenswrapper[4753]: I0129 14:20:48.163627 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c" path="/var/lib/kubelet/pods/9afab373-ceb7-4b44-b4d0-2c7e42ed7d5c/volumes" Jan 29 14:20:49 crc kubenswrapper[4753]: I0129 14:20:49.044605 4753 generic.go:334] "Generic (PLEG): container finished" podID="60785353-1684-46ad-92ed-4d984254055e" containerID="93e83a50b81245ae1c980724868a73a39f9cabd4d61a265bc0e9fc0a573f2612" exitCode=0 Jan 29 14:20:49 crc kubenswrapper[4753]: I0129 14:20:49.045005 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" event={"ID":"60785353-1684-46ad-92ed-4d984254055e","Type":"ContainerDied","Data":"93e83a50b81245ae1c980724868a73a39f9cabd4d61a265bc0e9fc0a573f2612"} Jan 29 14:20:50 crc kubenswrapper[4753]: I0129 14:20:50.202036 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" podUID="60785353-1684-46ad-92ed-4d984254055e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: connect: connection refused" Jan 29 14:20:55 crc kubenswrapper[4753]: I0129 14:20:55.202620 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" podUID="60785353-1684-46ad-92ed-4d984254055e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: connect: connection refused" Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.745900 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.773926 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-combined-ca-bundle\") pod \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.774089 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9bbt\" (UniqueName: \"kubernetes.io/projected/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-kube-api-access-m9bbt\") pod \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.774293 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-config-data\") pod \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.774333 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-db-sync-config-data\") pod \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\" (UID: \"53c1c8ae-4366-410e-b16b-3d06d55ce6e4\") " Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.797558 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-kube-api-access-m9bbt" (OuterVolumeSpecName: "kube-api-access-m9bbt") pod "53c1c8ae-4366-410e-b16b-3d06d55ce6e4" (UID: "53c1c8ae-4366-410e-b16b-3d06d55ce6e4"). InnerVolumeSpecName "kube-api-access-m9bbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.809938 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "53c1c8ae-4366-410e-b16b-3d06d55ce6e4" (UID: "53c1c8ae-4366-410e-b16b-3d06d55ce6e4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.836816 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53c1c8ae-4366-410e-b16b-3d06d55ce6e4" (UID: "53c1c8ae-4366-410e-b16b-3d06d55ce6e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.864705 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-config-data" (OuterVolumeSpecName: "config-data") pod "53c1c8ae-4366-410e-b16b-3d06d55ce6e4" (UID: "53c1c8ae-4366-410e-b16b-3d06d55ce6e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.875786 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.875820 4753 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.875832 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:56 crc kubenswrapper[4753]: I0129 14:20:56.875841 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9bbt\" (UniqueName: \"kubernetes.io/projected/53c1c8ae-4366-410e-b16b-3d06d55ce6e4-kube-api-access-m9bbt\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:57 crc kubenswrapper[4753]: I0129 14:20:57.054585 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:20:57 crc kubenswrapper[4753]: I0129 14:20:57.054642 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:20:57 crc kubenswrapper[4753]: I0129 14:20:57.116120 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gv87j" event={"ID":"53c1c8ae-4366-410e-b16b-3d06d55ce6e4","Type":"ContainerDied","Data":"ab198333c527bc278ca83a5ed2c64a8cc6a99e23519bda938c697c4dbe72f05f"} Jan 29 14:20:57 crc kubenswrapper[4753]: I0129 14:20:57.116175 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab198333c527bc278ca83a5ed2c64a8cc6a99e23519bda938c697c4dbe72f05f" Jan 29 14:20:57 crc kubenswrapper[4753]: I0129 14:20:57.116228 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gv87j" Jan 29 14:20:57 crc kubenswrapper[4753]: E0129 14:20:57.358906 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Jan 29 14:20:57 crc kubenswrapper[4753]: E0129 14:20:57.360380 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mgtn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-zlbdz_openstack(5b901113-4b66-4eb9-ac27-ad9446e6aa29): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:20:57 crc kubenswrapper[4753]: E0129 14:20:57.361620 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-zlbdz" podUID="5b901113-4b66-4eb9-ac27-ad9446e6aa29" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.144107 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66567888d7-7gh8w"] Jan 29 14:20:58 crc kubenswrapper[4753]: E0129 14:20:58.146850 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c1c8ae-4366-410e-b16b-3d06d55ce6e4" containerName="glance-db-sync" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.146887 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c1c8ae-4366-410e-b16b-3d06d55ce6e4" containerName="glance-db-sync" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.147264 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c1c8ae-4366-410e-b16b-3d06d55ce6e4" containerName="glance-db-sync" Jan 29 14:20:58 crc kubenswrapper[4753]: 
I0129 14:20:58.150266 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: E0129 14:20:58.166159 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack/barbican-db-sync-zlbdz" podUID="5b901113-4b66-4eb9-ac27-ad9446e6aa29" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.193420 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-7gh8w"] Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.240042 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-svc\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.240947 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-swift-storage-0\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.241439 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-nb\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.252583 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-config\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.252706 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-sb\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.252872 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8d8p\" (UniqueName: \"kubernetes.io/projected/b09bd787-c105-4203-a0ca-3201ba0c3645-kube-api-access-n8d8p\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.354743 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-nb\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " 
pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.355258 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-config\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.355296 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-sb\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.355338 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8d8p\" (UniqueName: \"kubernetes.io/projected/b09bd787-c105-4203-a0ca-3201ba0c3645-kube-api-access-n8d8p\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.355386 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-svc\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.355412 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-swift-storage-0\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.355865 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-nb\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.357375 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-sb\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.357399 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-svc\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.358608 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-swift-storage-0\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 
14:20:58.360331 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-config\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.385850 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8d8p\" (UniqueName: \"kubernetes.io/projected/b09bd787-c105-4203-a0ca-3201ba0c3645-kube-api-access-n8d8p\") pod \"dnsmasq-dns-66567888d7-7gh8w\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.482961 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:20:58 crc kubenswrapper[4753]: E0129 14:20:58.868815 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Jan 29 14:20:58 crc kubenswrapper[4753]: E0129 14:20:58.869002 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jwj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,A
ppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-79v95_openstack(a7b64f0e-f7ef-4737-a543-edba91ed6811): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 29 14:20:58 crc kubenswrapper[4753]: E0129 14:20:58.870437 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-79v95" podUID="a7b64f0e-f7ef-4737-a543-edba91ed6811" Jan 29 14:20:58 crc kubenswrapper[4753]: I0129 14:20:58.960771 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.076848 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-nb\") pod \"60785353-1684-46ad-92ed-4d984254055e\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.076945 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-sb\") pod \"60785353-1684-46ad-92ed-4d984254055e\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.077690 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-swift-storage-0\") pod \"60785353-1684-46ad-92ed-4d984254055e\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.077708 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f28xg\" (UniqueName: \"kubernetes.io/projected/60785353-1684-46ad-92ed-4d984254055e-kube-api-access-f28xg\") pod \"60785353-1684-46ad-92ed-4d984254055e\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.077748 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-svc\") pod \"60785353-1684-46ad-92ed-4d984254055e\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.077798 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-config\") pod \"60785353-1684-46ad-92ed-4d984254055e\" (UID: \"60785353-1684-46ad-92ed-4d984254055e\") " Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.085661 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60785353-1684-46ad-92ed-4d984254055e-kube-api-access-f28xg" (OuterVolumeSpecName: "kube-api-access-f28xg") pod "60785353-1684-46ad-92ed-4d984254055e" (UID: "60785353-1684-46ad-92ed-4d984254055e"). InnerVolumeSpecName "kube-api-access-f28xg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.179624 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f28xg\" (UniqueName: \"kubernetes.io/projected/60785353-1684-46ad-92ed-4d984254055e-kube-api-access-f28xg\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.182983 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" event={"ID":"60785353-1684-46ad-92ed-4d984254055e","Type":"ContainerDied","Data":"162ae6c7846c2215193bc8beef3677fee2b752b32e76cc3a413ab406ff8ab726"} Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.183039 4753 scope.go:117] "RemoveContainer" containerID="93e83a50b81245ae1c980724868a73a39f9cabd4d61a265bc0e9fc0a573f2612" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.183209 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bdffd66f-jvknq" Jan 29 14:20:59 crc kubenswrapper[4753]: E0129 14:20:59.187484 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-79v95" podUID="a7b64f0e-f7ef-4737-a543-edba91ed6811" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.207048 4753 scope.go:117] "RemoveContainer" containerID="ff8c57e9cd091f2d8bb0c5ce1996f9a36382b6cc2b2708bdfe2dfcc845e0dde9" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.218810 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:20:59 crc kubenswrapper[4753]: E0129 14:20:59.219237 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60785353-1684-46ad-92ed-4d984254055e" containerName="init" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.219261 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="60785353-1684-46ad-92ed-4d984254055e" containerName="init" Jan 29 14:20:59 crc kubenswrapper[4753]: E0129 14:20:59.219283 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60785353-1684-46ad-92ed-4d984254055e" containerName="dnsmasq-dns" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.219291 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="60785353-1684-46ad-92ed-4d984254055e" containerName="dnsmasq-dns" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.219476 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="60785353-1684-46ad-92ed-4d984254055e" containerName="dnsmasq-dns" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.220389 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.221102 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60785353-1684-46ad-92ed-4d984254055e" (UID: "60785353-1684-46ad-92ed-4d984254055e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.222656 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.222942 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-thtgt" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.223064 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.242632 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "60785353-1684-46ad-92ed-4d984254055e" (UID: "60785353-1684-46ad-92ed-4d984254055e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.256871 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60785353-1684-46ad-92ed-4d984254055e" (UID: "60785353-1684-46ad-92ed-4d984254055e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.261010 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-config" (OuterVolumeSpecName: "config") pod "60785353-1684-46ad-92ed-4d984254055e" (UID: "60785353-1684-46ad-92ed-4d984254055e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.265042 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "60785353-1684-46ad-92ed-4d984254055e" (UID: "60785353-1684-46ad-92ed-4d984254055e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.278237 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.281332 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.281412 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-logs\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.281488 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.281710 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.285196 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.285303 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmhfl\" (UniqueName: \"kubernetes.io/projected/b5a9b99a-4591-4d8b-8534-115cf9d549e1-kube-api-access-wmhfl\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.285916 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.286178 4753 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.286241 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-dns-svc\") on node \"crc\" 
DevicePath \"\"" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.286291 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.286339 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.286386 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60785353-1684-46ad-92ed-4d984254055e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.298619 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.302378 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.304552 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.307875 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.321705 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q5h9z"] Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388112 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388203 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388233 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388344 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9ll7\" (UniqueName: \"kubernetes.io/projected/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-kube-api-access-z9ll7\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388373 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-logs\") pod \"glance-default-external-api-0\" (UID: 
\"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388393 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388447 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388471 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388538 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388558 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.388611 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.389189 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-logs\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.390440 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.390533 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " 
pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.390620 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmhfl\" (UniqueName: \"kubernetes.io/projected/b5a9b99a-4591-4d8b-8534-115cf9d549e1-kube-api-access-wmhfl\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.390688 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.391434 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.393648 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.395692 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.406582 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmhfl\" (UniqueName: \"kubernetes.io/projected/b5a9b99a-4591-4d8b-8534-115cf9d549e1-kube-api-access-wmhfl\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.408240 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.416611 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") " pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: W0129 14:20:59.453843 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb09bd787_c105_4203_a0ca_3201ba0c3645.slice/crio-ebf16a1cc02788552699b44350e447da68a27ffaa2513c86f115680217cb1587 WatchSource:0}: Error finding container ebf16a1cc02788552699b44350e447da68a27ffaa2513c86f115680217cb1587: Status 404 returned error can't find the 
container with id ebf16a1cc02788552699b44350e447da68a27ffaa2513c86f115680217cb1587 Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.454978 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-7gh8w"] Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.491974 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.492034 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.492061 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.492135 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.492192 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9ll7\" (UniqueName: \"kubernetes.io/projected/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-kube-api-access-z9ll7\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.492213 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.492261 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.492650 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.492862 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.492938 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.495172 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.497543 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.517646 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.518239 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-jvknq"] Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.520420 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9ll7\" (UniqueName: \"kubernetes.io/projected/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-kube-api-access-z9ll7\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.540335 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bdffd66f-jvknq"] Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.543425 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.557736 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:20:59 crc kubenswrapper[4753]: I0129 14:20:59.624900 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.161609 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60785353-1684-46ad-92ed-4d984254055e" path="/var/lib/kubelet/pods/60785353-1684-46ad-92ed-4d984254055e/volumes" Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.191998 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nhv2c" event={"ID":"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c","Type":"ContainerStarted","Data":"e00e67a39ecbbcca5a026b499f041483245ed4bf836f275bfea36c7f48164b65"} Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.195642 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerStarted","Data":"ff655e22020abc68977684e4d6b5233ddce78356b5d081acecc4a689f675327e"} Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.213626 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-nhv2c" podStartSLOduration=2.912694327 podStartE2EDuration="23.213423809s" podCreationTimestamp="2026-01-29 14:20:37 +0000 UTC" firstStartedPulling="2026-01-29 14:20:38.539605159 +0000 UTC m=+1073.234339541" lastFinishedPulling="2026-01-29 14:20:58.840334641 +0000 UTC m=+1093.535069023" observedRunningTime="2026-01-29 14:21:00.209534944 +0000 UTC m=+1094.904269326" watchObservedRunningTime="2026-01-29 14:21:00.213423809 +0000 UTC m=+1094.908158191" Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.217454 4753 generic.go:334] "Generic (PLEG): container finished" podID="b09bd787-c105-4203-a0ca-3201ba0c3645" containerID="8191e39b2f0f4c59b6fbd551757fdf8c6644551043d7156e679ceec214d94f1f" exitCode=0 Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.217536 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" event={"ID":"b09bd787-c105-4203-a0ca-3201ba0c3645","Type":"ContainerDied","Data":"8191e39b2f0f4c59b6fbd551757fdf8c6644551043d7156e679ceec214d94f1f"} Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.217563 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" event={"ID":"b09bd787-c105-4203-a0ca-3201ba0c3645","Type":"ContainerStarted","Data":"ebf16a1cc02788552699b44350e447da68a27ffaa2513c86f115680217cb1587"} Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.220390 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5h9z" event={"ID":"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb","Type":"ContainerStarted","Data":"ebad3f69ada789b8ad418576e49d5088e4c02b2732eef46ca9ae5b4cc644d332"} Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.220430 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5h9z" event={"ID":"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb","Type":"ContainerStarted","Data":"fa018d805bcb408c6b2ecefddecc8f4031cb562ac42ea7fb361f7b61c6d9a1df"} Jan 29 14:21:00 crc kubenswrapper[4753]: I0129 14:21:00.281573 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-q5h9z" podStartSLOduration=13.281550139 podStartE2EDuration="13.281550139s" podCreationTimestamp="2026-01-29 14:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:00.2693794 +0000 UTC m=+1094.964113792" 
watchObservedRunningTime="2026-01-29 14:21:00.281550139 +0000 UTC m=+1094.976284521" Jan 29 14:21:01 crc kubenswrapper[4753]: I0129 14:21:01.069436 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:21:01 crc kubenswrapper[4753]: I0129 14:21:01.111302 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:21:01 crc kubenswrapper[4753]: I0129 14:21:01.156142 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:21:01 crc kubenswrapper[4753]: W0129 14:21:01.206961 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb32ec20f_cbc1_4df3_aa94_b5542a8268d8.slice/crio-0b48e3dc57ec1507d48d5c5c6d8083169c618c9fbd2c16bff282403730f799c2 WatchSource:0}: Error finding container 0b48e3dc57ec1507d48d5c5c6d8083169c618c9fbd2c16bff282403730f799c2: Status 404 returned error can't find the container with id 0b48e3dc57ec1507d48d5c5c6d8083169c618c9fbd2c16bff282403730f799c2 Jan 29 14:21:01 crc kubenswrapper[4753]: I0129 14:21:01.247307 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b32ec20f-cbc1-4df3-aa94-b5542a8268d8","Type":"ContainerStarted","Data":"0b48e3dc57ec1507d48d5c5c6d8083169c618c9fbd2c16bff282403730f799c2"} Jan 29 14:21:02 crc kubenswrapper[4753]: I0129 14:21:02.103515 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:21:02 crc kubenswrapper[4753]: W0129 14:21:02.116520 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5a9b99a_4591_4d8b_8534_115cf9d549e1.slice/crio-994455796a6f1c15ea2fc2de261033fa6223d25c7faa6762d78d9fb8dee77369 WatchSource:0}: Error finding container 994455796a6f1c15ea2fc2de261033fa6223d25c7faa6762d78d9fb8dee77369: Status 404 returned error can't find the container with id 994455796a6f1c15ea2fc2de261033fa6223d25c7faa6762d78d9fb8dee77369 Jan 29 14:21:02 crc kubenswrapper[4753]: I0129 14:21:02.258955 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerStarted","Data":"3d5f6cf41f97cc1c79168168e732ba985f336c94896a7f61c62971e27348ed24"} Jan 29 14:21:02 crc kubenswrapper[4753]: I0129 14:21:02.261300 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" event={"ID":"b09bd787-c105-4203-a0ca-3201ba0c3645","Type":"ContainerStarted","Data":"f6e77f0fdbdf5dcf134b20797af347648ff29546478df07ba96c683e298aec54"} Jan 29 14:21:02 crc kubenswrapper[4753]: I0129 14:21:02.261361 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:21:02 crc kubenswrapper[4753]: I0129 14:21:02.264139 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5a9b99a-4591-4d8b-8534-115cf9d549e1","Type":"ContainerStarted","Data":"994455796a6f1c15ea2fc2de261033fa6223d25c7faa6762d78d9fb8dee77369"} Jan 29 14:21:02 crc kubenswrapper[4753]: I0129 14:21:02.266659 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b32ec20f-cbc1-4df3-aa94-b5542a8268d8","Type":"ContainerStarted","Data":"b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252"} Jan 29 14:21:02 crc kubenswrapper[4753]: I0129 14:21:02.284195 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" podStartSLOduration=4.284176056 podStartE2EDuration="4.284176056s" podCreationTimestamp="2026-01-29 14:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:02.27804135 +0000 UTC m=+1096.972775752" watchObservedRunningTime="2026-01-29 14:21:02.284176056 +0000 UTC m=+1096.978910438" Jan 29 14:21:03 crc kubenswrapper[4753]: I0129 14:21:03.280181 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5a9b99a-4591-4d8b-8534-115cf9d549e1","Type":"ContainerStarted","Data":"da1ae90af0cb13e22f00fa6061683c30f0ffd82c7f6ae4ec521d3a1a0aea10b7"} Jan 29 14:21:03 crc kubenswrapper[4753]: I0129 14:21:03.280571 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5a9b99a-4591-4d8b-8534-115cf9d549e1","Type":"ContainerStarted","Data":"6cf829791a4f4878cfbaec0b92a1f1a45349cf6ae7bd635cbe469c9416b023c3"} Jan 29 14:21:03 crc kubenswrapper[4753]: I0129 14:21:03.280307 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerName="glance-log" containerID="cri-o://6cf829791a4f4878cfbaec0b92a1f1a45349cf6ae7bd635cbe469c9416b023c3" gracePeriod=30 Jan 29 14:21:03 crc kubenswrapper[4753]: I0129 14:21:03.280349 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerName="glance-httpd" containerID="cri-o://da1ae90af0cb13e22f00fa6061683c30f0ffd82c7f6ae4ec521d3a1a0aea10b7" gracePeriod=30 Jan 29 14:21:03 crc kubenswrapper[4753]: I0129 14:21:03.292176 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b32ec20f-cbc1-4df3-aa94-b5542a8268d8","Type":"ContainerStarted","Data":"51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff"} Jan 29 14:21:03 crc kubenswrapper[4753]: I0129 14:21:03.292450 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerName="glance-log" containerID="cri-o://b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252" gracePeriod=30 Jan 29 14:21:03 crc kubenswrapper[4753]: I0129 14:21:03.292458 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerName="glance-httpd" containerID="cri-o://51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff" gracePeriod=30 Jan 29 14:21:03 crc kubenswrapper[4753]: I0129 14:21:03.344103 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.344085837 podStartE2EDuration="5.344085837s" podCreationTimestamp="2026-01-29 14:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 
14:21:03.332626557 +0000 UTC m=+1098.027360939" watchObservedRunningTime="2026-01-29 14:21:03.344085837 +0000 UTC m=+1098.038820219" Jan 29 14:21:03 crc kubenswrapper[4753]: I0129 14:21:03.379793 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.37977605 podStartE2EDuration="5.37977605s" podCreationTimestamp="2026-01-29 14:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:03.363932472 +0000 UTC m=+1098.058666844" watchObservedRunningTime="2026-01-29 14:21:03.37977605 +0000 UTC m=+1098.074510432" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.005493 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.097956 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-combined-ca-bundle\") pod \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.098029 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-httpd-run\") pod \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.098131 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9ll7\" (UniqueName: \"kubernetes.io/projected/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-kube-api-access-z9ll7\") pod \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.098433 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-config-data\") pod \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.098499 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-logs\") pod \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.098521 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-scripts\") pod \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.098535 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\" (UID: \"b32ec20f-cbc1-4df3-aa94-b5542a8268d8\") " Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.098681 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"b32ec20f-cbc1-4df3-aa94-b5542a8268d8" (UID: "b32ec20f-cbc1-4df3-aa94-b5542a8268d8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.099011 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-logs" (OuterVolumeSpecName: "logs") pod "b32ec20f-cbc1-4df3-aa94-b5542a8268d8" (UID: "b32ec20f-cbc1-4df3-aa94-b5542a8268d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.099421 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.099440 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.109257 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "b32ec20f-cbc1-4df3-aa94-b5542a8268d8" (UID: "b32ec20f-cbc1-4df3-aa94-b5542a8268d8"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.110551 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-kube-api-access-z9ll7" (OuterVolumeSpecName: "kube-api-access-z9ll7") pod "b32ec20f-cbc1-4df3-aa94-b5542a8268d8" (UID: "b32ec20f-cbc1-4df3-aa94-b5542a8268d8"). InnerVolumeSpecName "kube-api-access-z9ll7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.112791 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-scripts" (OuterVolumeSpecName: "scripts") pod "b32ec20f-cbc1-4df3-aa94-b5542a8268d8" (UID: "b32ec20f-cbc1-4df3-aa94-b5542a8268d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.127866 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b32ec20f-cbc1-4df3-aa94-b5542a8268d8" (UID: "b32ec20f-cbc1-4df3-aa94-b5542a8268d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.149878 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-config-data" (OuterVolumeSpecName: "config-data") pod "b32ec20f-cbc1-4df3-aa94-b5542a8268d8" (UID: "b32ec20f-cbc1-4df3-aa94-b5542a8268d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.200650 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.200690 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.200700 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.200711 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9ll7\" (UniqueName: \"kubernetes.io/projected/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-kube-api-access-z9ll7\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.200721 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32ec20f-cbc1-4df3-aa94-b5542a8268d8-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.217962 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.301847 4753 generic.go:334] "Generic (PLEG): container finished" podID="aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" containerID="ebad3f69ada789b8ad418576e49d5088e4c02b2732eef46ca9ae5b4cc644d332" exitCode=0 Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.301913 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5h9z" event={"ID":"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb","Type":"ContainerDied","Data":"ebad3f69ada789b8ad418576e49d5088e4c02b2732eef46ca9ae5b4cc644d332"} Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.301885 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.306058 4753 generic.go:334] "Generic (PLEG): container finished" podID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerID="6cf829791a4f4878cfbaec0b92a1f1a45349cf6ae7bd635cbe469c9416b023c3" exitCode=143 Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.306119 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5a9b99a-4591-4d8b-8534-115cf9d549e1","Type":"ContainerDied","Data":"6cf829791a4f4878cfbaec0b92a1f1a45349cf6ae7bd635cbe469c9416b023c3"} Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.309058 4753 generic.go:334] "Generic (PLEG): container finished" podID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerID="51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff" exitCode=0 Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.309083 4753 generic.go:334] "Generic (PLEG): container finished" podID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerID="b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252" exitCode=143 Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 
14:21:04.309104 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.309188 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b32ec20f-cbc1-4df3-aa94-b5542a8268d8","Type":"ContainerDied","Data":"51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff"} Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.309210 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b32ec20f-cbc1-4df3-aa94-b5542a8268d8","Type":"ContainerDied","Data":"b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252"} Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.309220 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b32ec20f-cbc1-4df3-aa94-b5542a8268d8","Type":"ContainerDied","Data":"0b48e3dc57ec1507d48d5c5c6d8083169c618c9fbd2c16bff282403730f799c2"} Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.309260 4753 scope.go:117] "RemoveContainer" containerID="51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.341426 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.352963 4753 scope.go:117] "RemoveContainer" containerID="b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.357455 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.376051 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:21:04 crc kubenswrapper[4753]: E0129 14:21:04.376467 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerName="glance-httpd" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.376484 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerName="glance-httpd" Jan 29 14:21:04 crc kubenswrapper[4753]: E0129 14:21:04.376499 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerName="glance-log" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.376506 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerName="glance-log" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.376662 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerName="glance-httpd" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.376685 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" containerName="glance-log" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.377549 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.379701 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.380224 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.387058 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.411094 4753 scope.go:117] "RemoveContainer" containerID="51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff" Jan 29 14:21:04 crc kubenswrapper[4753]: E0129 14:21:04.411508 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff\": container with ID starting with 51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff not found: ID does not exist" containerID="51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.411537 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff"} err="failed to get container status \"51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff\": rpc error: code = NotFound desc = could not find container \"51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff\": container with ID starting with 51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff not found: ID does not exist" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.411562 4753 scope.go:117] "RemoveContainer" containerID="b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252" Jan 29 14:21:04 crc kubenswrapper[4753]: E0129 14:21:04.411852 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252\": container with ID starting with b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252 not found: ID does not exist" containerID="b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.411873 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252"} err="failed to get container status \"b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252\": rpc error: code = NotFound desc = could not find container \"b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252\": container with ID starting with b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252 not found: ID does not exist" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.411886 4753 scope.go:117] "RemoveContainer" containerID="51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.412185 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff"} err="failed to get container status 
\"51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff\": rpc error: code = NotFound desc = could not find container \"51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff\": container with ID starting with 51faa17449c43ab844105d467c4ca4cee2a6445fd4336cba70be7278a93de5ff not found: ID does not exist" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.412206 4753 scope.go:117] "RemoveContainer" containerID="b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.412485 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252"} err="failed to get container status \"b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252\": rpc error: code = NotFound desc = could not find container \"b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252\": container with ID starting with b1744e94e27994b55687e6e5365f366dd3c2fe13cee607e733adce46491a2252 not found: ID does not exist" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.508117 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-config-data\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.508182 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-logs\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.508212 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.508240 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.508268 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.508287 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-scripts\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: 
I0129 14:21:04.508340 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.508376 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9qr6\" (UniqueName: \"kubernetes.io/projected/107a450b-f03d-4fbb-ad45-e04584ed3972-kube-api-access-h9qr6\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.609541 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-config-data\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.609588 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-logs\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.609612 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.609636 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.609662 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.609682 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-scripts\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.609726 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.609760 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h9qr6\" (UniqueName: \"kubernetes.io/projected/107a450b-f03d-4fbb-ad45-e04584ed3972-kube-api-access-h9qr6\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.610105 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-logs\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.610313 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.610609 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.614750 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-scripts\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.616239 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.616827 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.624663 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-config-data\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.634956 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9qr6\" (UniqueName: \"kubernetes.io/projected/107a450b-f03d-4fbb-ad45-e04584ed3972-kube-api-access-h9qr6\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.643105 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:21:04 crc kubenswrapper[4753]: I0129 14:21:04.782979 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:05 crc kubenswrapper[4753]: I0129 14:21:05.332141 4753 generic.go:334] "Generic (PLEG): container finished" podID="1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" containerID="e00e67a39ecbbcca5a026b499f041483245ed4bf836f275bfea36c7f48164b65" exitCode=0 Jan 29 14:21:05 crc kubenswrapper[4753]: I0129 14:21:05.332309 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nhv2c" event={"ID":"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c","Type":"ContainerDied","Data":"e00e67a39ecbbcca5a026b499f041483245ed4bf836f275bfea36c7f48164b65"} Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.172674 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b32ec20f-cbc1-4df3-aa94-b5542a8268d8" path="/var/lib/kubelet/pods/b32ec20f-cbc1-4df3-aa94-b5542a8268d8/volumes" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.521988 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.660835 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-combined-ca-bundle\") pod \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.660902 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlvz8\" (UniqueName: \"kubernetes.io/projected/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-kube-api-access-jlvz8\") pod \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.661122 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-scripts\") pod \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.662017 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-config-data\") pod \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.662170 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-credential-keys\") pod \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.662273 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-fernet-keys\") pod \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\" (UID: \"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb\") " Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.669110 4753 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-kube-api-access-jlvz8" (OuterVolumeSpecName: "kube-api-access-jlvz8") pod "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" (UID: "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb"). InnerVolumeSpecName "kube-api-access-jlvz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.669218 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" (UID: "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.669313 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" (UID: "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.670667 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-scripts" (OuterVolumeSpecName: "scripts") pod "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" (UID: "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.692090 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" (UID: "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.713083 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-config-data" (OuterVolumeSpecName: "config-data") pod "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" (UID: "aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.768463 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.768500 4753 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.768515 4753 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.768530 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.768544 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlvz8\" (UniqueName: \"kubernetes.io/projected/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-kube-api-access-jlvz8\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:06 crc kubenswrapper[4753]: I0129 14:21:06.768590 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.366418 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5h9z" event={"ID":"aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb","Type":"ContainerDied","Data":"fa018d805bcb408c6b2ecefddecc8f4031cb562ac42ea7fb361f7b61c6d9a1df"} Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.366822 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa018d805bcb408c6b2ecefddecc8f4031cb562ac42ea7fb361f7b61c6d9a1df" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.366470 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5h9z" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.620833 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7b9f57fc94-gqqlc"] Jan 29 14:21:07 crc kubenswrapper[4753]: E0129 14:21:07.621235 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" containerName="keystone-bootstrap" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.621248 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" containerName="keystone-bootstrap" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.621424 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" containerName="keystone-bootstrap" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.621964 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.624668 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.624874 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rq2fm" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.624941 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.625033 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.626501 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.626776 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.638229 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b9f57fc94-gqqlc"] Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.684362 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzpx4\" (UniqueName: \"kubernetes.io/projected/cf6045aa-89c7-46c0-ba1e-4d63b9740883-kube-api-access-pzpx4\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.684407 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-credential-keys\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.684455 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-public-tls-certs\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.684546 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-internal-tls-certs\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.684721 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-config-data\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.684801 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-scripts\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: 
\"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.684893 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-fernet-keys\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.684934 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-combined-ca-bundle\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.786209 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-config-data\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.786297 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-scripts\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.786338 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-fernet-keys\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.786365 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-combined-ca-bundle\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.786461 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzpx4\" (UniqueName: \"kubernetes.io/projected/cf6045aa-89c7-46c0-ba1e-4d63b9740883-kube-api-access-pzpx4\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.786490 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-credential-keys\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.786549 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-public-tls-certs\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 
29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.786582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-internal-tls-certs\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.790710 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-config-data\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.791528 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-internal-tls-certs\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.791842 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-credential-keys\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.792192 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-public-tls-certs\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.792813 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-combined-ca-bundle\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.793084 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-scripts\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.793946 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-fernet-keys\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.822809 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzpx4\" (UniqueName: \"kubernetes.io/projected/cf6045aa-89c7-46c0-ba1e-4d63b9740883-kube-api-access-pzpx4\") pod \"keystone-7b9f57fc94-gqqlc\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:07 crc kubenswrapper[4753]: I0129 14:21:07.981928 4753 util.go:30] "No sandbox for pod can be found. 
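
The entries above walk each declared volume of the keystone pod through kubelet's three reconciler phases, in order: operationExecutor.VerifyControllerAttachedVolume (the desired-state check), operationExecutor.MountVolume started, and MountVolume.SetUp succeeded. When auditing a log like this it helps to pair the phases up per volume; a minimal sketch, assuming the log has been saved to a plain-text file (the path is a placeholder):

    import re
    from collections import defaultdict

    # Match the reconciler/operation_generator phase and the volume name, e.g.
    #   "operationExecutor.MountVolume started for volume \"fernet-keys\" ..."
    PHASE_RE = re.compile(
        r'(VerifyControllerAttachedVolume started|MountVolume started|'
        r'MountVolume\.SetUp succeeded) for volume \\?"([\w.-]+)\\?"'
    )

    phases = defaultdict(list)            # volume name -> ordered phase list
    with open("kubelet.log") as f:        # placeholder path
        for line in f:
            m = PHASE_RE.search(line)
            if m:
                phases[m.group(2)].append(m.group(1))

    for volume, seen in sorted(phases.items()):
        # A healthy mount shows all three phases, in this order.
        print(f"{volume}: {' -> '.join(seen)}")
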
Need to start a new one" pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.375666 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nhv2c" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.386241 4753 generic.go:334] "Generic (PLEG): container finished" podID="07dc97c8-9d59-458c-81af-f83e6f71b09c" containerID="14e5df1a722e4fdd9f1cdff60bc64e62c47ca34041f0ff038441520109891f30" exitCode=0 Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.386403 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-drwzl" event={"ID":"07dc97c8-9d59-458c-81af-f83e6f71b09c","Type":"ContainerDied","Data":"14e5df1a722e4fdd9f1cdff60bc64e62c47ca34041f0ff038441520109891f30"} Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.397415 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-scripts\") pod \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.397544 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-logs\") pod \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.397642 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-config-data\") pod \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.397695 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-combined-ca-bundle\") pod \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.397812 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hqlf\" (UniqueName: \"kubernetes.io/projected/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-kube-api-access-8hqlf\") pod \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.400535 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nhv2c" event={"ID":"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c","Type":"ContainerDied","Data":"06e04707d6cc64023d6d85b13ad1b03de9578a1677410578d68ffa7d61cd4b83"} Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.400580 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06e04707d6cc64023d6d85b13ad1b03de9578a1677410578d68ffa7d61cd4b83" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.400657 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nhv2c" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.407879 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-kube-api-access-8hqlf" (OuterVolumeSpecName: "kube-api-access-8hqlf") pod "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" (UID: "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c"). InnerVolumeSpecName "kube-api-access-8hqlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.408845 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-logs" (OuterVolumeSpecName: "logs") pod "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" (UID: "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.413132 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-scripts" (OuterVolumeSpecName: "scripts") pod "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" (UID: "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:08 crc kubenswrapper[4753]: E0129 14:21:08.457707 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-config-data podName:1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c nodeName:}" failed. No retries permitted until 2026-01-29 14:21:08.957669803 +0000 UTC m=+1103.652404185 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-config-data") pod "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" (UID: "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c") : error deleting /var/lib/kubelet/pods/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c/volume-subpaths: remove /var/lib/kubelet/pods/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c/volume-subpaths: no such file or directory Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.461502 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" (UID: "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.485222 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.504860 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.505715 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.505731 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hqlf\" (UniqueName: \"kubernetes.io/projected/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-kube-api-access-8hqlf\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.505748 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.538506 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8495b76777-c62gs"] Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.539601 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8495b76777-c62gs" podUID="5ae2b88d-362f-4152-af3a-95780fc0bdf4" containerName="dnsmasq-dns" containerID="cri-o://13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093" gracePeriod=10 Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.856370 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b9f57fc94-gqqlc"] Jan 29 14:21:08 crc kubenswrapper[4753]: I0129 14:21:08.964268 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.020999 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-config-data\") pod \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\" (UID: \"1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.029478 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-config-data" (OuterVolumeSpecName: "config-data") pod "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" (UID: "1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.106677 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.127553 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.228463 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-sb\") pod \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.228543 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-config\") pod \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.228697 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-svc\") pod \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.228748 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-swift-storage-0\") pod \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.228781 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-nb\") pod \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.228828 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzjm6\" (UniqueName: \"kubernetes.io/projected/5ae2b88d-362f-4152-af3a-95780fc0bdf4-kube-api-access-dzjm6\") pod \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\" (UID: \"5ae2b88d-362f-4152-af3a-95780fc0bdf4\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.234340 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae2b88d-362f-4152-af3a-95780fc0bdf4-kube-api-access-dzjm6" (OuterVolumeSpecName: "kube-api-access-dzjm6") pod "5ae2b88d-362f-4152-af3a-95780fc0bdf4" (UID: "5ae2b88d-362f-4152-af3a-95780fc0bdf4"). InnerVolumeSpecName "kube-api-access-dzjm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.277905 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ae2b88d-362f-4152-af3a-95780fc0bdf4" (UID: "5ae2b88d-362f-4152-af3a-95780fc0bdf4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.279024 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ae2b88d-362f-4152-af3a-95780fc0bdf4" (UID: "5ae2b88d-362f-4152-af3a-95780fc0bdf4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.279083 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-config" (OuterVolumeSpecName: "config") pod "5ae2b88d-362f-4152-af3a-95780fc0bdf4" (UID: "5ae2b88d-362f-4152-af3a-95780fc0bdf4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.281587 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5ae2b88d-362f-4152-af3a-95780fc0bdf4" (UID: "5ae2b88d-362f-4152-af3a-95780fc0bdf4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.291067 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ae2b88d-362f-4152-af3a-95780fc0bdf4" (UID: "5ae2b88d-362f-4152-af3a-95780fc0bdf4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.330951 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.330979 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.330989 4753 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.331000 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.331010 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzjm6\" (UniqueName: \"kubernetes.io/projected/5ae2b88d-362f-4152-af3a-95780fc0bdf4-kube-api-access-dzjm6\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.331019 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae2b88d-362f-4152-af3a-95780fc0bdf4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.414117 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"107a450b-f03d-4fbb-ad45-e04584ed3972","Type":"ContainerStarted","Data":"309dc651b59d91cac5a63e3afaae4d369469ea22758191f49aadbe0296ec2404"} Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.416177 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b9f57fc94-gqqlc" event={"ID":"cf6045aa-89c7-46c0-ba1e-4d63b9740883","Type":"ContainerStarted","Data":"e450be8a7013dbd08831b7150e1c8ccd73e085bf16b828b286d89403e5f7cfbc"} Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.416273 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.416291 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b9f57fc94-gqqlc" event={"ID":"cf6045aa-89c7-46c0-ba1e-4d63b9740883","Type":"ContainerStarted","Data":"d945de52fc406e592ec0c1c5c84efcee7158744da734004630f75b6961d196a9"} Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.423047 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerStarted","Data":"ce57ea4271b091b656e99f290c9acae064d40a8fc209b44e9ae762dd4dfb8ffb"} Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.424802 4753 generic.go:334] "Generic (PLEG): container finished" podID="5ae2b88d-362f-4152-af3a-95780fc0bdf4" containerID="13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093" exitCode=0 Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.424885 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8495b76777-c62gs" event={"ID":"5ae2b88d-362f-4152-af3a-95780fc0bdf4","Type":"ContainerDied","Data":"13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093"} Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.425996 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8495b76777-c62gs" event={"ID":"5ae2b88d-362f-4152-af3a-95780fc0bdf4","Type":"ContainerDied","Data":"967385d61740ff4d82b5cfd1f92478bb7c2bb87a9231699e7acb1e3763ec1947"} Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.426017 4753 scope.go:117] "RemoveContainer" containerID="13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.424922 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8495b76777-c62gs" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.449213 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7b9f57fc94-gqqlc" podStartSLOduration=2.449191225 podStartE2EDuration="2.449191225s" podCreationTimestamp="2026-01-29 14:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:09.438642521 +0000 UTC m=+1104.133376903" watchObservedRunningTime="2026-01-29 14:21:09.449191225 +0000 UTC m=+1104.143925607" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.476285 4753 scope.go:117] "RemoveContainer" containerID="655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.478240 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8495b76777-c62gs"] Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.489112 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8495b76777-c62gs"] Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.505608 4753 scope.go:117] "RemoveContainer" containerID="13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093" Jan 29 14:21:09 crc kubenswrapper[4753]: E0129 14:21:09.506226 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093\": container with ID starting with 13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093 not found: ID does not exist" containerID="13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.506333 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093"} err="failed to get container status \"13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093\": rpc error: code = NotFound desc = could not find container \"13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093\": container with ID starting with 13d44b494aed8bc44d19734439c558d57a5fc3ed26d29d5cc831446fce503093 not found: ID does not exist" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.506406 4753 scope.go:117] "RemoveContainer" containerID="655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804" Jan 29 14:21:09 crc kubenswrapper[4753]: E0129 14:21:09.506804 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804\": container with ID starting with 655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804 not found: ID does not exist" containerID="655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.506852 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804"} err="failed to get container status \"655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804\": rpc error: code = NotFound desc = could not find container \"655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804\": container with ID starting with 
655aceb612d1682f1ded2db08df614b07b2ecc048fdbd73ed10d035725c82804 not found: ID does not exist" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.791186 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-78cbccfdbb-x7lwh"] Jan 29 14:21:09 crc kubenswrapper[4753]: E0129 14:21:09.791518 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" containerName="placement-db-sync" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.791535 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" containerName="placement-db-sync" Jan 29 14:21:09 crc kubenswrapper[4753]: E0129 14:21:09.791552 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae2b88d-362f-4152-af3a-95780fc0bdf4" containerName="init" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.791557 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae2b88d-362f-4152-af3a-95780fc0bdf4" containerName="init" Jan 29 14:21:09 crc kubenswrapper[4753]: E0129 14:21:09.791574 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae2b88d-362f-4152-af3a-95780fc0bdf4" containerName="dnsmasq-dns" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.791581 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae2b88d-362f-4152-af3a-95780fc0bdf4" containerName="dnsmasq-dns" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.791752 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae2b88d-362f-4152-af3a-95780fc0bdf4" containerName="dnsmasq-dns" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.791766 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" containerName="placement-db-sync" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.792623 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.797555 4753 util.go:48] "No ready sandbox for pod can be found. 
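
The two E0129 "ContainerStatus from runtime service failed ... NotFound" entries above are a benign race: kubelet asks the runtime to remove containers 13d44b49... and 655aceb6... that the earlier RemoveContainer calls already deleted, logs the NotFound error, and moves on to removing the pod. Treating "already gone" as success is the usual way to make such cleanup idempotent; a sketch of the same pattern on the filesystem:

    import errno
    import os

    def remove_idempotent(path: str) -> None:
        """Delete path, treating 'does not exist' as success -- the same
        tolerance kubelet shows for the NotFound errors logged above."""
        try:
            os.unlink(path)
        except OSError as err:
            if err.errno != errno.ENOENT:    # only "no such file" is benign
                raise

    remove_idempotent("/tmp/already-gone")   # repeated calls are no-ops
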
Need to start a new one" pod="openstack/neutron-db-sync-drwzl" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.807076 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.807282 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.807388 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.807500 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9nfqw" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.807684 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.821605 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78cbccfdbb-x7lwh"] Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.838412 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-config\") pod \"07dc97c8-9d59-458c-81af-f83e6f71b09c\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.838878 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfbwr\" (UniqueName: \"kubernetes.io/projected/07dc97c8-9d59-458c-81af-f83e6f71b09c-kube-api-access-hfbwr\") pod \"07dc97c8-9d59-458c-81af-f83e6f71b09c\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.838933 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-combined-ca-bundle\") pod \"07dc97c8-9d59-458c-81af-f83e6f71b09c\" (UID: \"07dc97c8-9d59-458c-81af-f83e6f71b09c\") " Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.839182 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-combined-ca-bundle\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.839253 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-config-data\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.839293 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-internal-tls-certs\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.839328 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-logs\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.839355 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-scripts\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.839372 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th988\" (UniqueName: \"kubernetes.io/projected/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-kube-api-access-th988\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.839390 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-public-tls-certs\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.843021 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07dc97c8-9d59-458c-81af-f83e6f71b09c-kube-api-access-hfbwr" (OuterVolumeSpecName: "kube-api-access-hfbwr") pod "07dc97c8-9d59-458c-81af-f83e6f71b09c" (UID: "07dc97c8-9d59-458c-81af-f83e6f71b09c"). InnerVolumeSpecName "kube-api-access-hfbwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.875450 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07dc97c8-9d59-458c-81af-f83e6f71b09c" (UID: "07dc97c8-9d59-458c-81af-f83e6f71b09c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.889651 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-config" (OuterVolumeSpecName: "config") pod "07dc97c8-9d59-458c-81af-f83e6f71b09c" (UID: "07dc97c8-9d59-458c-81af-f83e6f71b09c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941002 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-config-data\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941062 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-internal-tls-certs\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941099 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-logs\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941130 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-scripts\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941162 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th988\" (UniqueName: \"kubernetes.io/projected/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-kube-api-access-th988\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941182 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-public-tls-certs\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941215 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-combined-ca-bundle\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941654 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfbwr\" (UniqueName: \"kubernetes.io/projected/07dc97c8-9d59-458c-81af-f83e6f71b09c-kube-api-access-hfbwr\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941893 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.941911 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/07dc97c8-9d59-458c-81af-f83e6f71b09c-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:09 crc 
kubenswrapper[4753]: I0129 14:21:09.942584 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-logs\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.945641 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-combined-ca-bundle\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.945860 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-internal-tls-certs\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.947418 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-scripts\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.949529 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-config-data\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.950553 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-public-tls-certs\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:09 crc kubenswrapper[4753]: I0129 14:21:09.958267 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th988\" (UniqueName: \"kubernetes.io/projected/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-kube-api-access-th988\") pod \"placement-78cbccfdbb-x7lwh\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.027937 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.164342 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae2b88d-362f-4152-af3a-95780fc0bdf4" path="/var/lib/kubelet/pods/5ae2b88d-362f-4152-af3a-95780fc0bdf4/volumes" Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.438322 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-drwzl" event={"ID":"07dc97c8-9d59-458c-81af-f83e6f71b09c","Type":"ContainerDied","Data":"8dc2422e3235c15a5ec383b92b53c27453e9f267a75e63d951c62e4fcfc57a00"} Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.438371 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc2422e3235c15a5ec383b92b53c27453e9f267a75e63d951c62e4fcfc57a00" Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.438377 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-drwzl" Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.440187 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zlbdz" event={"ID":"5b901113-4b66-4eb9-ac27-ad9446e6aa29","Type":"ContainerStarted","Data":"8c50dc9a01d5ccf0266d0160ffff1fd0b192424fdbde1d80fdc73417a471c529"} Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.445334 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"107a450b-f03d-4fbb-ad45-e04584ed3972","Type":"ContainerStarted","Data":"324879b4bc43390a72b922e6f1989965ca8ade4892900bc8469ddff0dccfb0ef"} Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.445379 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"107a450b-f03d-4fbb-ad45-e04584ed3972","Type":"ContainerStarted","Data":"82d06422f106535b2475859bf3a3c11ef7c13e57119445daed3d619f602e5be6"} Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.475610 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-zlbdz" podStartSLOduration=2.23291641 podStartE2EDuration="33.475590319s" podCreationTimestamp="2026-01-29 14:20:37 +0000 UTC" firstStartedPulling="2026-01-29 14:20:38.371636743 +0000 UTC m=+1073.066371115" lastFinishedPulling="2026-01-29 14:21:09.614310642 +0000 UTC m=+1104.309045024" observedRunningTime="2026-01-29 14:21:10.465800754 +0000 UTC m=+1105.160535156" watchObservedRunningTime="2026-01-29 14:21:10.475590319 +0000 UTC m=+1105.170324701" Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.507846 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.507821359 podStartE2EDuration="6.507821359s" podCreationTimestamp="2026-01-29 14:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:10.490653896 +0000 UTC m=+1105.185388278" watchObservedRunningTime="2026-01-29 14:21:10.507821359 +0000 UTC m=+1105.202555741" Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.519527 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78cbccfdbb-x7lwh"] Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.648791 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-hh94p"] Jan 29 14:21:10 crc kubenswrapper[4753]: E0129 14:21:10.649934 4753 
Jan 29 14:21:10 crc kubenswrapper[4753]: E0129 14:21:10.649934 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07dc97c8-9d59-458c-81af-f83e6f71b09c" containerName="neutron-db-sync"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.649966 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="07dc97c8-9d59-458c-81af-f83e6f71b09c" containerName="neutron-db-sync"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.658656 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="07dc97c8-9d59-458c-81af-f83e6f71b09c" containerName="neutron-db-sync"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.660103 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.676250 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-hh94p"]
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.715444 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56665b56dd-pw99j"]
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.718885 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.722019 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.723656 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.733199 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.743781 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56665b56dd-pw99j"]
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.748410 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tn74t"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.766017 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h48v6\" (UniqueName: \"kubernetes.io/projected/3b74239b-a7f3-4d1d-b292-50d131a0dad1-kube-api-access-h48v6\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.766126 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-swift-storage-0\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.766226 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-svc\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.766429 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-config\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.766495 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-nb\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.766653 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-httpd-config\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.766867 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-config\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.767017 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kl7q\" (UniqueName: \"kubernetes.io/projected/4192bada-5336-48ed-9bb7-d6db5a1e6278-kube-api-access-7kl7q\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.767076 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-combined-ca-bundle\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.767098 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-sb\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.767119 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-ovndb-tls-certs\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869160 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h48v6\" (UniqueName: \"kubernetes.io/projected/3b74239b-a7f3-4d1d-b292-50d131a0dad1-kube-api-access-h48v6\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869214 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-swift-storage-0\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869246 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-svc\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869308 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-config\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869340 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-nb\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869366 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-httpd-config\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869393 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-config\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869439 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kl7q\" (UniqueName: \"kubernetes.io/projected/4192bada-5336-48ed-9bb7-d6db5a1e6278-kube-api-access-7kl7q\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869468 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-sb\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869483 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-combined-ca-bundle\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.869497 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-ovndb-tls-certs\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.870350 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-swift-storage-0\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.870363 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-nb\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.870361 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-svc\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.870641 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-config\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.871037 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-sb\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.874249 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-combined-ca-bundle\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.877949 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-ovndb-tls-certs\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.881955 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-httpd-config\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.891856 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-config\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.895256 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h48v6\" (UniqueName: \"kubernetes.io/projected/3b74239b-a7f3-4d1d-b292-50d131a0dad1-kube-api-access-h48v6\") pod \"dnsmasq-dns-7bb67c87c9-hh94p\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") " pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:10 crc kubenswrapper[4753]: I0129 14:21:10.895506 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kl7q\" (UniqueName: \"kubernetes.io/projected/4192bada-5336-48ed-9bb7-d6db5a1e6278-kube-api-access-7kl7q\") pod \"neutron-56665b56dd-pw99j\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") " pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:11 crc kubenswrapper[4753]: I0129 14:21:11.072883 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:11 crc kubenswrapper[4753]: I0129 14:21:11.080773 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:11 crc kubenswrapper[4753]: I0129 14:21:11.477225 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cbccfdbb-x7lwh" event={"ID":"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7","Type":"ContainerStarted","Data":"7928bff86f6434736cea8c468f448a8ff6bae7916724dac5e9b712cc68b65281"}
Jan 29 14:21:11 crc kubenswrapper[4753]: I0129 14:21:11.477614 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cbccfdbb-x7lwh" event={"ID":"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7","Type":"ContainerStarted","Data":"0048536c095c1deea68c977175a1614cd4da94f70d3e465ff69e6a722a840709"}
Jan 29 14:21:11 crc kubenswrapper[4753]: I0129 14:21:11.477625 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cbccfdbb-x7lwh" event={"ID":"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7","Type":"ContainerStarted","Data":"60a7001a94c2226bc46eb06d5822549a8f0c4196428f5dc7b1b8b967f00daefe"}
Jan 29 14:21:11 crc kubenswrapper[4753]: I0129 14:21:11.541442 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-78cbccfdbb-x7lwh" podStartSLOduration=2.541425137 podStartE2EDuration="2.541425137s" podCreationTimestamp="2026-01-29 14:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:11.53300142 +0000 UTC m=+1106.227735802" watchObservedRunningTime="2026-01-29 14:21:11.541425137 +0000 UTC m=+1106.236159519"
Jan 29 14:21:11 crc kubenswrapper[4753]: I0129 14:21:11.606856 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-hh94p"]
Jan 29 14:21:11 crc kubenswrapper[4753]: I0129 14:21:11.832575 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56665b56dd-pw99j"]
Jan 29 14:21:11 crc kubenswrapper[4753]: W0129 14:21:11.839981 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4192bada_5336_48ed_9bb7_d6db5a1e6278.slice/crio-d88912bc2517ad063b12a12057e4c0f0d9a9d8cf92350d8975097dd6d9a079c8 WatchSource:0}: Error finding container d88912bc2517ad063b12a12057e4c0f0d9a9d8cf92350d8975097dd6d9a079c8: Status 404 returned error can't find the container with id d88912bc2517ad063b12a12057e4c0f0d9a9d8cf92350d8975097dd6d9a079c8
Jan 29 14:21:12 crc kubenswrapper[4753]: I0129 14:21:12.490988 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56665b56dd-pw99j" event={"ID":"4192bada-5336-48ed-9bb7-d6db5a1e6278","Type":"ContainerStarted","Data":"af2fdc76bc0016cdd7bf585f0bccc1c95c9e4d9b502387f54870c1a7b741f38b"}
Jan 29 14:21:12 crc kubenswrapper[4753]: I0129 14:21:12.491540 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56665b56dd-pw99j" event={"ID":"4192bada-5336-48ed-9bb7-d6db5a1e6278","Type":"ContainerStarted","Data":"d88912bc2517ad063b12a12057e4c0f0d9a9d8cf92350d8975097dd6d9a079c8"}
Jan 29 14:21:12 crc kubenswrapper[4753]: I0129 14:21:12.493871 4753 generic.go:334] "Generic (PLEG): container finished" podID="3b74239b-a7f3-4d1d-b292-50d131a0dad1" containerID="50ab971e9a239744fde27c5ac8cddddc70b2c7c9487a2ad97df2565c8899536b" exitCode=0
Jan 29 14:21:12 crc kubenswrapper[4753]: I0129 14:21:12.495436 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p" event={"ID":"3b74239b-a7f3-4d1d-b292-50d131a0dad1","Type":"ContainerDied","Data":"50ab971e9a239744fde27c5ac8cddddc70b2c7c9487a2ad97df2565c8899536b"}
Jan 29 14:21:12 crc kubenswrapper[4753]: I0129 14:21:12.495498 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p" event={"ID":"3b74239b-a7f3-4d1d-b292-50d131a0dad1","Type":"ContainerStarted","Data":"73be0fc283e78f9aa9cd1424060227302861f495695d2ddc7218369bf45d0075"}
Jan 29 14:21:12 crc kubenswrapper[4753]: I0129 14:21:12.495523 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78cbccfdbb-x7lwh"
Jan 29 14:21:12 crc kubenswrapper[4753]: I0129 14:21:12.495540 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78cbccfdbb-x7lwh"
Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.506616 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56665b56dd-pw99j" event={"ID":"4192bada-5336-48ed-9bb7-d6db5a1e6278","Type":"ContainerStarted","Data":"338979fa3be2b8317f4043dc0e2703a440119f91dac85ae2da3ad29090e1a012"}
Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.506959 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.545264 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56665b56dd-pw99j" podStartSLOduration=3.545242434 podStartE2EDuration="3.545242434s" podCreationTimestamp="2026-01-29 14:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:13.528930873 +0000 UTC m=+1108.223665265" watchObservedRunningTime="2026-01-29 14:21:13.545242434 +0000 UTC m=+1108.239976816"
Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.637036 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55965d95bf-pftcq"]
Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.638611 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.640707 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.656484 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.656916 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55965d95bf-pftcq"] Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.723945 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-httpd-config\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.724081 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-public-tls-certs\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.724199 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-config\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.724313 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-ovndb-tls-certs\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.724364 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-combined-ca-bundle\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.724388 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-internal-tls-certs\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.724594 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpg98\" (UniqueName: \"kubernetes.io/projected/1897b4f4-3f70-4584-9801-59c207f4d1db-kube-api-access-lpg98\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.826080 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-config\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.826189 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-ovndb-tls-certs\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.826208 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-combined-ca-bundle\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.826225 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-internal-tls-certs\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.826288 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpg98\" (UniqueName: \"kubernetes.io/projected/1897b4f4-3f70-4584-9801-59c207f4d1db-kube-api-access-lpg98\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.826318 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-httpd-config\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.826348 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-public-tls-certs\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.832085 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-internal-tls-certs\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.832559 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-httpd-config\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.832681 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-config\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " 
pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.832789 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-public-tls-certs\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.833777 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-combined-ca-bundle\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.835802 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-ovndb-tls-certs\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:13 crc kubenswrapper[4753]: I0129 14:21:13.844259 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpg98\" (UniqueName: \"kubernetes.io/projected/1897b4f4-3f70-4584-9801-59c207f4d1db-kube-api-access-lpg98\") pod \"neutron-55965d95bf-pftcq\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") " pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:14 crc kubenswrapper[4753]: I0129 14:21:14.002336 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:14 crc kubenswrapper[4753]: I0129 14:21:14.525551 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p" event={"ID":"3b74239b-a7f3-4d1d-b292-50d131a0dad1","Type":"ContainerStarted","Data":"cad549f567dc8d2e735609c8aa87f47f0cd16a596a682692a57f23801cc75689"} Jan 29 14:21:14 crc kubenswrapper[4753]: I0129 14:21:14.526079 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p" Jan 29 14:21:14 crc kubenswrapper[4753]: I0129 14:21:14.552059 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p" podStartSLOduration=4.552040049 podStartE2EDuration="4.552040049s" podCreationTimestamp="2026-01-29 14:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:14.549656954 +0000 UTC m=+1109.244391356" watchObservedRunningTime="2026-01-29 14:21:14.552040049 +0000 UTC m=+1109.246774431" Jan 29 14:21:14 crc kubenswrapper[4753]: I0129 14:21:14.784714 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:14 crc kubenswrapper[4753]: I0129 14:21:14.784801 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:14 crc kubenswrapper[4753]: I0129 14:21:14.827384 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:14 crc kubenswrapper[4753]: I0129 14:21:14.845908 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:15 crc 
kubenswrapper[4753]: I0129 14:21:15.535807 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:15 crc kubenswrapper[4753]: I0129 14:21:15.535869 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:17 crc kubenswrapper[4753]: I0129 14:21:17.463218 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:17 crc kubenswrapper[4753]: I0129 14:21:17.474113 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 14:21:17 crc kubenswrapper[4753]: I0129 14:21:17.560521 4753 generic.go:334] "Generic (PLEG): container finished" podID="5b901113-4b66-4eb9-ac27-ad9446e6aa29" containerID="8c50dc9a01d5ccf0266d0160ffff1fd0b192424fdbde1d80fdc73417a471c529" exitCode=0 Jan 29 14:21:17 crc kubenswrapper[4753]: I0129 14:21:17.561328 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zlbdz" event={"ID":"5b901113-4b66-4eb9-ac27-ad9446e6aa29","Type":"ContainerDied","Data":"8c50dc9a01d5ccf0266d0160ffff1fd0b192424fdbde1d80fdc73417a471c529"} Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.210420 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-zlbdz" Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.377690 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-db-sync-config-data\") pod \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.377944 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-combined-ca-bundle\") pod \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.378018 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgtn6\" (UniqueName: \"kubernetes.io/projected/5b901113-4b66-4eb9-ac27-ad9446e6aa29-kube-api-access-mgtn6\") pod \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\" (UID: \"5b901113-4b66-4eb9-ac27-ad9446e6aa29\") " Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.382989 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b901113-4b66-4eb9-ac27-ad9446e6aa29-kube-api-access-mgtn6" (OuterVolumeSpecName: "kube-api-access-mgtn6") pod "5b901113-4b66-4eb9-ac27-ad9446e6aa29" (UID: "5b901113-4b66-4eb9-ac27-ad9446e6aa29"). InnerVolumeSpecName "kube-api-access-mgtn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.383900 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5b901113-4b66-4eb9-ac27-ad9446e6aa29" (UID: "5b901113-4b66-4eb9-ac27-ad9446e6aa29"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.402217 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b901113-4b66-4eb9-ac27-ad9446e6aa29" (UID: "5b901113-4b66-4eb9-ac27-ad9446e6aa29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.480287 4753 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.480587 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b901113-4b66-4eb9-ac27-ad9446e6aa29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.480597 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgtn6\" (UniqueName: \"kubernetes.io/projected/5b901113-4b66-4eb9-ac27-ad9446e6aa29-kube-api-access-mgtn6\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.576470 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55965d95bf-pftcq"] Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.592521 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-zlbdz" event={"ID":"5b901113-4b66-4eb9-ac27-ad9446e6aa29","Type":"ContainerDied","Data":"7d3a97f9ba19502b485d735f9c704a1ffdc2374a727200d07e7d774e26fc38e7"} Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.592561 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d3a97f9ba19502b485d735f9c704a1ffdc2374a727200d07e7d774e26fc38e7" Jan 29 14:21:20 crc kubenswrapper[4753]: I0129 14:21:20.592613 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-zlbdz" Jan 29 14:21:20 crc kubenswrapper[4753]: W0129 14:21:20.593251 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1897b4f4_3f70_4584_9801_59c207f4d1db.slice/crio-0fd70a45beb18140cb3620e5018c855d745317f628eca68ac5a1e7e4ce42157c WatchSource:0}: Error finding container 0fd70a45beb18140cb3620e5018c855d745317f628eca68ac5a1e7e4ce42157c: Status 404 returned error can't find the container with id 0fd70a45beb18140cb3620e5018c855d745317f628eca68ac5a1e7e4ce42157c Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.075506 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.169422 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-7gh8w"] Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.169951 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" podUID="b09bd787-c105-4203-a0ca-3201ba0c3645" containerName="dnsmasq-dns" containerID="cri-o://f6e77f0fdbdf5dcf134b20797af347648ff29546478df07ba96c683e298aec54" gracePeriod=10 Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.489393 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d9c4fb469-wlxbk"] Jan 29 14:21:21 crc kubenswrapper[4753]: E0129 14:21:21.490022 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b901113-4b66-4eb9-ac27-ad9446e6aa29" containerName="barbican-db-sync" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.490036 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b901113-4b66-4eb9-ac27-ad9446e6aa29" containerName="barbican-db-sync" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.490240 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b901113-4b66-4eb9-ac27-ad9446e6aa29" containerName="barbican-db-sync" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.491115 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.497882 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.497978 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5crk2" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.498360 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.499307 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data-custom\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.499456 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-combined-ca-bundle\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.499538 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.499557 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-599fdd45b6-c7l8c"] Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.499616 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5766d009-a05f-4e8e-8267-9bd6c1267d3a-logs\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.499639 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zphs5\" (UniqueName: \"kubernetes.io/projected/5766d009-a05f-4e8e-8267-9bd6c1267d3a-kube-api-access-zphs5\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.500937 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.502181 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.510306 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d9c4fb469-wlxbk"] Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.528343 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-599fdd45b6-c7l8c"] Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.601196 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-combined-ca-bundle\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.601967 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.602025 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5766d009-a05f-4e8e-8267-9bd6c1267d3a-logs\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.602045 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zphs5\" (UniqueName: \"kubernetes.io/projected/5766d009-a05f-4e8e-8267-9bd6c1267d3a-kube-api-access-zphs5\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.602098 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prv7t\" (UniqueName: \"kubernetes.io/projected/62921d0c-9482-49a2-8d20-5dda61ba80da-kube-api-access-prv7t\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.602145 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62921d0c-9482-49a2-8d20-5dda61ba80da-logs\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.602191 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data-custom\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 
14:21:21.602210 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.602230 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-combined-ca-bundle\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.602254 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data-custom\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.604229 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55965d95bf-pftcq" event={"ID":"1897b4f4-3f70-4584-9801-59c207f4d1db","Type":"ContainerStarted","Data":"0fd70a45beb18140cb3620e5018c855d745317f628eca68ac5a1e7e4ce42157c"} Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.606032 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5766d009-a05f-4e8e-8267-9bd6c1267d3a-logs\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.609846 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data-custom\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.609982 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-combined-ca-bundle\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.617066 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.633048 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-hjmpz"] Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.634573 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.639030 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zphs5\" (UniqueName: \"kubernetes.io/projected/5766d009-a05f-4e8e-8267-9bd6c1267d3a-kube-api-access-zphs5\") pod \"barbican-worker-6d9c4fb469-wlxbk\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.656120 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-hjmpz"] Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703489 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-svc\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703600 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-nb\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703626 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prv7t\" (UniqueName: \"kubernetes.io/projected/62921d0c-9482-49a2-8d20-5dda61ba80da-kube-api-access-prv7t\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703683 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62921d0c-9482-49a2-8d20-5dda61ba80da-logs\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703701 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-config\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703721 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-swift-storage-0\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703755 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-sb\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703773 
4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pslf2\" (UniqueName: \"kubernetes.io/projected/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-kube-api-access-pslf2\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703793 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data-custom\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.703808 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.704063 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-combined-ca-bundle\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.704883 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62921d0c-9482-49a2-8d20-5dda61ba80da-logs\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.707972 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data-custom\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.708089 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-combined-ca-bundle\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.710632 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.730006 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9d9fc848-865w5"] Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.732636 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-prv7t\" (UniqueName: \"kubernetes.io/projected/62921d0c-9482-49a2-8d20-5dda61ba80da-kube-api-access-prv7t\") pod \"barbican-keystone-listener-599fdd45b6-c7l8c\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.735364 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.744522 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d9fc848-865w5"] Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.790604 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805413 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-combined-ca-bundle\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805463 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-svc\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805498 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brwm2\" (UniqueName: \"kubernetes.io/projected/79c4f6e1-381b-4c9f-b265-547887d39ab1-kube-api-access-brwm2\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805518 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data-custom\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805556 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805583 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-nb\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805630 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-config\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " 
pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805645 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-swift-storage-0\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805662 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-sb\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805680 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79c4f6e1-381b-4c9f-b265-547887d39ab1-logs\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.805698 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pslf2\" (UniqueName: \"kubernetes.io/projected/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-kube-api-access-pslf2\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.806695 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-svc\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.806797 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-nb\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.807157 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-sb\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.807209 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-config\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.807311 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-swift-storage-0\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 
14:21:21.810720 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.820244 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.827905 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pslf2\" (UniqueName: \"kubernetes.io/projected/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-kube-api-access-pslf2\") pod \"dnsmasq-dns-54c4dfcffc-hjmpz\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.910845 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79c4f6e1-381b-4c9f-b265-547887d39ab1-logs\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.911083 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-combined-ca-bundle\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.911126 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brwm2\" (UniqueName: \"kubernetes.io/projected/79c4f6e1-381b-4c9f-b265-547887d39ab1-kube-api-access-brwm2\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.911146 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data-custom\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.911206 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.911861 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79c4f6e1-381b-4c9f-b265-547887d39ab1-logs\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.916227 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-combined-ca-bundle\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.916421 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data-custom\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.921619 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:21 crc kubenswrapper[4753]: I0129 14:21:21.936313 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brwm2\" (UniqueName: \"kubernetes.io/projected/79c4f6e1-381b-4c9f-b265-547887d39ab1-kube-api-access-brwm2\") pod \"barbican-api-9d9fc848-865w5\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.007910 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.112831 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.292302 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d9c4fb469-wlxbk"] Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.300553 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-599fdd45b6-c7l8c"] Jan 29 14:21:22 crc kubenswrapper[4753]: W0129 14:21:22.364897 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62921d0c_9482_49a2_8d20_5dda61ba80da.slice/crio-0fc06144907ebce1c52c6c2f39e97bc3d8ae95784a20a740f4f2e06aca457a92 WatchSource:0}: Error finding container 0fc06144907ebce1c52c6c2f39e97bc3d8ae95784a20a740f4f2e06aca457a92: Status 404 returned error can't find the container with id 0fc06144907ebce1c52c6c2f39e97bc3d8ae95784a20a740f4f2e06aca457a92 Jan 29 14:21:22 crc kubenswrapper[4753]: W0129 14:21:22.371464 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5766d009_a05f_4e8e_8267_9bd6c1267d3a.slice/crio-f602b4bb61fe764b47c81e745fe06ca697de6376b87af80355b0767cc16d33ad WatchSource:0}: Error finding container f602b4bb61fe764b47c81e745fe06ca697de6376b87af80355b0767cc16d33ad: Status 404 returned error can't find the container with id f602b4bb61fe764b47c81e745fe06ca697de6376b87af80355b0767cc16d33ad Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.533191 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-hjmpz"] Jan 29 14:21:22 crc kubenswrapper[4753]: W0129 14:21:22.582898 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod100d69bd_7c7b_41b8_84a6_c78e4480f0b4.slice/crio-8942c52cdb0ba13262adc7b084123d1d1360e21dae951eb3494ece3295b87db3 WatchSource:0}: Error finding container 8942c52cdb0ba13262adc7b084123d1d1360e21dae951eb3494ece3295b87db3: Status 404 returned error can't find the container with id 8942c52cdb0ba13262adc7b084123d1d1360e21dae951eb3494ece3295b87db3 Jan 29 14:21:22 crc kubenswrapper[4753]: 
I0129 14:21:22.614558 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55965d95bf-pftcq" event={"ID":"1897b4f4-3f70-4584-9801-59c207f4d1db","Type":"ContainerStarted","Data":"ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560"} Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.614604 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55965d95bf-pftcq" event={"ID":"1897b4f4-3f70-4584-9801-59c207f4d1db","Type":"ContainerStarted","Data":"24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78"} Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.614741 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55965d95bf-pftcq" Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.616635 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-79v95" event={"ID":"a7b64f0e-f7ef-4737-a543-edba91ed6811","Type":"ContainerStarted","Data":"3b46cefbd0f083420a3feabe6fcc97942c18ff0babb3df072dd24c5df0a8f895"} Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.621329 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" event={"ID":"62921d0c-9482-49a2-8d20-5dda61ba80da","Type":"ContainerStarted","Data":"0fc06144907ebce1c52c6c2f39e97bc3d8ae95784a20a740f4f2e06aca457a92"} Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.625369 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" event={"ID":"5766d009-a05f-4e8e-8267-9bd6c1267d3a","Type":"ContainerStarted","Data":"f602b4bb61fe764b47c81e745fe06ca697de6376b87af80355b0767cc16d33ad"} Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.627597 4753 generic.go:334] "Generic (PLEG): container finished" podID="b09bd787-c105-4203-a0ca-3201ba0c3645" containerID="f6e77f0fdbdf5dcf134b20797af347648ff29546478df07ba96c683e298aec54" exitCode=0 Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.627685 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" event={"ID":"b09bd787-c105-4203-a0ca-3201ba0c3645","Type":"ContainerDied","Data":"f6e77f0fdbdf5dcf134b20797af347648ff29546478df07ba96c683e298aec54"} Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.635384 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" event={"ID":"100d69bd-7c7b-41b8-84a6-c78e4480f0b4","Type":"ContainerStarted","Data":"8942c52cdb0ba13262adc7b084123d1d1360e21dae951eb3494ece3295b87db3"} Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.648379 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55965d95bf-pftcq" podStartSLOduration=9.648302437 podStartE2EDuration="9.648302437s" podCreationTimestamp="2026-01-29 14:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:22.633533749 +0000 UTC m=+1117.328268131" watchObservedRunningTime="2026-01-29 14:21:22.648302437 +0000 UTC m=+1117.343036839" Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.674222 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-79v95" podStartSLOduration=4.551665534 podStartE2EDuration="45.674193887s" podCreationTimestamp="2026-01-29 14:20:37 +0000 UTC" firstStartedPulling="2026-01-29 14:20:38.938209813 +0000 UTC m=+1073.632944195" 
lastFinishedPulling="2026-01-29 14:21:20.060738126 +0000 UTC m=+1114.755472548" observedRunningTime="2026-01-29 14:21:22.650427135 +0000 UTC m=+1117.345161517" watchObservedRunningTime="2026-01-29 14:21:22.674193887 +0000 UTC m=+1117.368928289" Jan 29 14:21:22 crc kubenswrapper[4753]: W0129 14:21:22.682598 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79c4f6e1_381b_4c9f_b265_547887d39ab1.slice/crio-2a000feffcc85d3c79bc62afeb68221a77990935841b6f037fa71f150179e847 WatchSource:0}: Error finding container 2a000feffcc85d3c79bc62afeb68221a77990935841b6f037fa71f150179e847: Status 404 returned error can't find the container with id 2a000feffcc85d3c79bc62afeb68221a77990935841b6f037fa71f150179e847 Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.684502 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d9fc848-865w5"] Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.802804 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.939715 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-config\") pod \"b09bd787-c105-4203-a0ca-3201ba0c3645\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.940124 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-swift-storage-0\") pod \"b09bd787-c105-4203-a0ca-3201ba0c3645\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.940190 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-nb\") pod \"b09bd787-c105-4203-a0ca-3201ba0c3645\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.940312 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-sb\") pod \"b09bd787-c105-4203-a0ca-3201ba0c3645\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.940390 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-svc\") pod \"b09bd787-c105-4203-a0ca-3201ba0c3645\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.940472 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8d8p\" (UniqueName: \"kubernetes.io/projected/b09bd787-c105-4203-a0ca-3201ba0c3645-kube-api-access-n8d8p\") pod \"b09bd787-c105-4203-a0ca-3201ba0c3645\" (UID: \"b09bd787-c105-4203-a0ca-3201ba0c3645\") " Jan 29 14:21:22 crc kubenswrapper[4753]: I0129 14:21:22.956360 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09bd787-c105-4203-a0ca-3201ba0c3645-kube-api-access-n8d8p" (OuterVolumeSpecName: "kube-api-access-n8d8p") pod 
"b09bd787-c105-4203-a0ca-3201ba0c3645" (UID: "b09bd787-c105-4203-a0ca-3201ba0c3645"). InnerVolumeSpecName "kube-api-access-n8d8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.000190 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b09bd787-c105-4203-a0ca-3201ba0c3645" (UID: "b09bd787-c105-4203-a0ca-3201ba0c3645"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.008176 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b09bd787-c105-4203-a0ca-3201ba0c3645" (UID: "b09bd787-c105-4203-a0ca-3201ba0c3645"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.013363 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b09bd787-c105-4203-a0ca-3201ba0c3645" (UID: "b09bd787-c105-4203-a0ca-3201ba0c3645"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.013495 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-config" (OuterVolumeSpecName: "config") pod "b09bd787-c105-4203-a0ca-3201ba0c3645" (UID: "b09bd787-c105-4203-a0ca-3201ba0c3645"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.014059 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b09bd787-c105-4203-a0ca-3201ba0c3645" (UID: "b09bd787-c105-4203-a0ca-3201ba0c3645"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.047828 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.047856 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.047868 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8d8p\" (UniqueName: \"kubernetes.io/projected/b09bd787-c105-4203-a0ca-3201ba0c3645-kube-api-access-n8d8p\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.047879 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.047888 4753 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.047897 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b09bd787-c105-4203-a0ca-3201ba0c3645-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.657123 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerStarted","Data":"f4bf28df49742c84950ddb8eb746fb35293229ce94c851c3a852f37a0e62fd95"} Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.657322 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="ceilometer-central-agent" containerID="cri-o://ff655e22020abc68977684e4d6b5233ddce78356b5d081acecc4a689f675327e" gracePeriod=30 Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.657534 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.657951 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="proxy-httpd" containerID="cri-o://f4bf28df49742c84950ddb8eb746fb35293229ce94c851c3a852f37a0e62fd95" gracePeriod=30 Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.658014 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="sg-core" containerID="cri-o://ce57ea4271b091b656e99f290c9acae064d40a8fc209b44e9ae762dd4dfb8ffb" gracePeriod=30 Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.658064 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="ceilometer-notification-agent" containerID="cri-o://3d5f6cf41f97cc1c79168168e732ba985f336c94896a7f61c62971e27348ed24" gracePeriod=30 Jan 29 14:21:23 crc 
kubenswrapper[4753]: I0129 14:21:23.672807 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" event={"ID":"b09bd787-c105-4203-a0ca-3201ba0c3645","Type":"ContainerDied","Data":"ebf16a1cc02788552699b44350e447da68a27ffaa2513c86f115680217cb1587"} Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.672903 4753 scope.go:117] "RemoveContainer" containerID="f6e77f0fdbdf5dcf134b20797af347648ff29546478df07ba96c683e298aec54" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.673080 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66567888d7-7gh8w" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.683616 4753 generic.go:334] "Generic (PLEG): container finished" podID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" containerID="6b2cb51caa1768f1c65c4de91f615e7e3e193803e87b6dc9802ce542552641b3" exitCode=0 Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.683679 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" event={"ID":"100d69bd-7c7b-41b8-84a6-c78e4480f0b4","Type":"ContainerDied","Data":"6b2cb51caa1768f1c65c4de91f615e7e3e193803e87b6dc9802ce542552641b3"} Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.684374 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.014015081 podStartE2EDuration="46.684355162s" podCreationTimestamp="2026-01-29 14:20:37 +0000 UTC" firstStartedPulling="2026-01-29 14:20:38.934946125 +0000 UTC m=+1073.629680507" lastFinishedPulling="2026-01-29 14:21:22.605286206 +0000 UTC m=+1117.300020588" observedRunningTime="2026-01-29 14:21:23.676259583 +0000 UTC m=+1118.370993965" watchObservedRunningTime="2026-01-29 14:21:23.684355162 +0000 UTC m=+1118.379089534" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.689768 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fc848-865w5" event={"ID":"79c4f6e1-381b-4c9f-b265-547887d39ab1","Type":"ContainerStarted","Data":"ceaef1ceec2f6ca383381ca30573762ffc047afa4a7e186ad2cbedd99e33d17e"} Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.689813 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fc848-865w5" event={"ID":"79c4f6e1-381b-4c9f-b265-547887d39ab1","Type":"ContainerStarted","Data":"8b1ef8941d5ded68d97fcd43e2874bf764c0c6c81914b6d866e1f640938892bf"} Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.689822 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fc848-865w5" event={"ID":"79c4f6e1-381b-4c9f-b265-547887d39ab1","Type":"ContainerStarted","Data":"2a000feffcc85d3c79bc62afeb68221a77990935841b6f037fa71f150179e847"} Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.689864 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.689877 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.710240 4753 scope.go:117] "RemoveContainer" containerID="8191e39b2f0f4c59b6fbd551757fdf8c6644551043d7156e679ceec214d94f1f" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.739511 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9d9fc848-865w5" podStartSLOduration=2.73949403 
podStartE2EDuration="2.73949403s" podCreationTimestamp="2026-01-29 14:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:23.721212416 +0000 UTC m=+1118.415946788" watchObservedRunningTime="2026-01-29 14:21:23.73949403 +0000 UTC m=+1118.434228412" Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.839228 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-7gh8w"] Jan 29 14:21:23 crc kubenswrapper[4753]: I0129 14:21:23.840484 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66567888d7-7gh8w"] Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.169467 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b09bd787-c105-4203-a0ca-3201ba0c3645" path="/var/lib/kubelet/pods/b09bd787-c105-4203-a0ca-3201ba0c3645/volumes" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.625934 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-754c57f55b-2hkbd"] Jan 29 14:21:24 crc kubenswrapper[4753]: E0129 14:21:24.626616 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09bd787-c105-4203-a0ca-3201ba0c3645" containerName="dnsmasq-dns" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.626635 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09bd787-c105-4203-a0ca-3201ba0c3645" containerName="dnsmasq-dns" Jan 29 14:21:24 crc kubenswrapper[4753]: E0129 14:21:24.626653 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09bd787-c105-4203-a0ca-3201ba0c3645" containerName="init" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.626660 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09bd787-c105-4203-a0ca-3201ba0c3645" containerName="init" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.626825 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09bd787-c105-4203-a0ca-3201ba0c3645" containerName="dnsmasq-dns" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.627810 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.633554 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.633667 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.634840 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-754c57f55b-2hkbd"] Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.685195 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data-custom\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.685355 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.685402 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7pv\" (UniqueName: \"kubernetes.io/projected/920278e2-a31f-4ad2-81be-d30a799b9d64-kube-api-access-rb7pv\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.685525 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-public-tls-certs\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.685559 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-internal-tls-certs\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.685590 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/920278e2-a31f-4ad2-81be-d30a799b9d64-logs\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.685619 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-combined-ca-bundle\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.720680 4753 generic.go:334] "Generic (PLEG): 
container finished" podID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerID="f4bf28df49742c84950ddb8eb746fb35293229ce94c851c3a852f37a0e62fd95" exitCode=0 Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.720837 4753 generic.go:334] "Generic (PLEG): container finished" podID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerID="ce57ea4271b091b656e99f290c9acae064d40a8fc209b44e9ae762dd4dfb8ffb" exitCode=2 Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.720898 4753 generic.go:334] "Generic (PLEG): container finished" podID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerID="ff655e22020abc68977684e4d6b5233ddce78356b5d081acecc4a689f675327e" exitCode=0 Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.720920 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerDied","Data":"f4bf28df49742c84950ddb8eb746fb35293229ce94c851c3a852f37a0e62fd95"} Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.721254 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerDied","Data":"ce57ea4271b091b656e99f290c9acae064d40a8fc209b44e9ae762dd4dfb8ffb"} Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.721284 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerDied","Data":"ff655e22020abc68977684e4d6b5233ddce78356b5d081acecc4a689f675327e"} Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.787607 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-public-tls-certs\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.787811 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-internal-tls-certs\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.787837 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/920278e2-a31f-4ad2-81be-d30a799b9d64-logs\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.787859 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-combined-ca-bundle\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.787915 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data-custom\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 
14:21:24.787990 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.788017 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7pv\" (UniqueName: \"kubernetes.io/projected/920278e2-a31f-4ad2-81be-d30a799b9d64-kube-api-access-rb7pv\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.788387 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/920278e2-a31f-4ad2-81be-d30a799b9d64-logs\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.791536 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-public-tls-certs\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.795807 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data-custom\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.796274 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-internal-tls-certs\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.796286 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.796781 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-combined-ca-bundle\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:24 crc kubenswrapper[4753]: I0129 14:21:24.802819 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7pv\" (UniqueName: \"kubernetes.io/projected/920278e2-a31f-4ad2-81be-d30a799b9d64-kube-api-access-rb7pv\") pod \"barbican-api-754c57f55b-2hkbd\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.021460 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.522204 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-754c57f55b-2hkbd"] Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.753938 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-754c57f55b-2hkbd" event={"ID":"920278e2-a31f-4ad2-81be-d30a799b9d64","Type":"ContainerStarted","Data":"d43baf5011f66db39e4f48ff74f5212be4b15a17d4fe4496eef46839b9b4fdd0"} Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.757359 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" event={"ID":"5766d009-a05f-4e8e-8267-9bd6c1267d3a","Type":"ContainerStarted","Data":"4c6f60456d1d153f776a9722a5c5c012f82210a1ce9b96301d44ce657ac865a5"} Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.757402 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" event={"ID":"5766d009-a05f-4e8e-8267-9bd6c1267d3a","Type":"ContainerStarted","Data":"125994f3ec1405b9e8dad0539053aa0a87df73ee58d6e17040fc7d0d422ceb19"} Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.762836 4753 generic.go:334] "Generic (PLEG): container finished" podID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerID="3d5f6cf41f97cc1c79168168e732ba985f336c94896a7f61c62971e27348ed24" exitCode=0 Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.762922 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerDied","Data":"3d5f6cf41f97cc1c79168168e732ba985f336c94896a7f61c62971e27348ed24"} Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.766133 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" event={"ID":"100d69bd-7c7b-41b8-84a6-c78e4480f0b4","Type":"ContainerStarted","Data":"0b6e62b72fa89354b656420159d71664b787f973cbac0aca4b873ca56e236d8e"} Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.766303 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.769218 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" event={"ID":"62921d0c-9482-49a2-8d20-5dda61ba80da","Type":"ContainerStarted","Data":"4aa6335e18e43896614c27e4605f0813d42217c00887d65f886c827434a9fc80"} Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.769262 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" event={"ID":"62921d0c-9482-49a2-8d20-5dda61ba80da","Type":"ContainerStarted","Data":"ad233db4e8b1fd3e1f6f172ac5f80cdfd53b96374113b3ddbb73831c7e9cb5c9"} Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.787395 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" podStartSLOduration=2.669634065 podStartE2EDuration="4.787371295s" podCreationTimestamp="2026-01-29 14:21:21 +0000 UTC" firstStartedPulling="2026-01-29 14:21:22.382951086 +0000 UTC m=+1117.077685468" lastFinishedPulling="2026-01-29 14:21:24.500688316 +0000 UTC m=+1119.195422698" observedRunningTime="2026-01-29 14:21:25.775512326 +0000 UTC m=+1120.470246728" watchObservedRunningTime="2026-01-29 14:21:25.787371295 +0000 UTC m=+1120.482105687" Jan 29 14:21:25 crc 
kubenswrapper[4753]: I0129 14:21:25.806873 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" podStartSLOduration=4.806851341 podStartE2EDuration="4.806851341s" podCreationTimestamp="2026-01-29 14:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:25.796052239 +0000 UTC m=+1120.490786631" watchObservedRunningTime="2026-01-29 14:21:25.806851341 +0000 UTC m=+1120.501585733" Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.910326 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:21:25 crc kubenswrapper[4753]: I0129 14:21:25.942630 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" podStartSLOduration=2.811220668 podStartE2EDuration="4.942611816s" podCreationTimestamp="2026-01-29 14:21:21 +0000 UTC" firstStartedPulling="2026-01-29 14:21:22.368030013 +0000 UTC m=+1117.062764385" lastFinishedPulling="2026-01-29 14:21:24.499421151 +0000 UTC m=+1119.194155533" observedRunningTime="2026-01-29 14:21:25.817897269 +0000 UTC m=+1120.512631651" watchObservedRunningTime="2026-01-29 14:21:25.942611816 +0000 UTC m=+1120.637346198" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.111517 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-scripts\") pod \"ae4b866a-f13e-4cfd-95c1-50b64d249217\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.111769 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsb42\" (UniqueName: \"kubernetes.io/projected/ae4b866a-f13e-4cfd-95c1-50b64d249217-kube-api-access-dsb42\") pod \"ae4b866a-f13e-4cfd-95c1-50b64d249217\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.111841 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-sg-core-conf-yaml\") pod \"ae4b866a-f13e-4cfd-95c1-50b64d249217\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.111873 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-run-httpd\") pod \"ae4b866a-f13e-4cfd-95c1-50b64d249217\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.112558 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-log-httpd\") pod \"ae4b866a-f13e-4cfd-95c1-50b64d249217\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.112594 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-combined-ca-bundle\") pod \"ae4b866a-f13e-4cfd-95c1-50b64d249217\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.112636 4753 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-config-data\") pod \"ae4b866a-f13e-4cfd-95c1-50b64d249217\" (UID: \"ae4b866a-f13e-4cfd-95c1-50b64d249217\") " Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.117505 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ae4b866a-f13e-4cfd-95c1-50b64d249217" (UID: "ae4b866a-f13e-4cfd-95c1-50b64d249217"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.117784 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ae4b866a-f13e-4cfd-95c1-50b64d249217" (UID: "ae4b866a-f13e-4cfd-95c1-50b64d249217"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.120662 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-scripts" (OuterVolumeSpecName: "scripts") pod "ae4b866a-f13e-4cfd-95c1-50b64d249217" (UID: "ae4b866a-f13e-4cfd-95c1-50b64d249217"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.125483 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae4b866a-f13e-4cfd-95c1-50b64d249217-kube-api-access-dsb42" (OuterVolumeSpecName: "kube-api-access-dsb42") pod "ae4b866a-f13e-4cfd-95c1-50b64d249217" (UID: "ae4b866a-f13e-4cfd-95c1-50b64d249217"). InnerVolumeSpecName "kube-api-access-dsb42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.148202 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ae4b866a-f13e-4cfd-95c1-50b64d249217" (UID: "ae4b866a-f13e-4cfd-95c1-50b64d249217"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.215258 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsb42\" (UniqueName: \"kubernetes.io/projected/ae4b866a-f13e-4cfd-95c1-50b64d249217-kube-api-access-dsb42\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.215297 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.215309 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.215322 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae4b866a-f13e-4cfd-95c1-50b64d249217-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.215332 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.226561 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-config-data" (OuterVolumeSpecName: "config-data") pod "ae4b866a-f13e-4cfd-95c1-50b64d249217" (UID: "ae4b866a-f13e-4cfd-95c1-50b64d249217"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.229420 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae4b866a-f13e-4cfd-95c1-50b64d249217" (UID: "ae4b866a-f13e-4cfd-95c1-50b64d249217"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.317048 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.317074 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae4b866a-f13e-4cfd-95c1-50b64d249217-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.787196 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.787196 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae4b866a-f13e-4cfd-95c1-50b64d249217","Type":"ContainerDied","Data":"dec543e3fff7226d229688f209f1431e5bd828f37960b101ec7840b804de5960"} Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.787276 4753 scope.go:117] "RemoveContainer" containerID="f4bf28df49742c84950ddb8eb746fb35293229ce94c851c3a852f37a0e62fd95" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.791015 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-754c57f55b-2hkbd" event={"ID":"920278e2-a31f-4ad2-81be-d30a799b9d64","Type":"ContainerStarted","Data":"831d4b37a8f34dc6da88e5991350950aff013d5011940ad0f9ecbf93b46818a1"} Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.791068 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-754c57f55b-2hkbd" event={"ID":"920278e2-a31f-4ad2-81be-d30a799b9d64","Type":"ContainerStarted","Data":"eae4f86edd4d25f3149ae9e9e7b406efbd7b4f7e532051ca604088e784fc5e54"} Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.840677 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-754c57f55b-2hkbd" podStartSLOduration=2.8406596950000003 podStartE2EDuration="2.840659695s" podCreationTimestamp="2026-01-29 14:21:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:26.820341557 +0000 UTC m=+1121.515075949" watchObservedRunningTime="2026-01-29 14:21:26.840659695 +0000 UTC m=+1121.535394087" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.844660 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.852373 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.871105 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:21:26 crc kubenswrapper[4753]: E0129 14:21:26.871446 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="proxy-httpd" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.871461 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="proxy-httpd" Jan 29 14:21:26 crc kubenswrapper[4753]: E0129 14:21:26.871480 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="ceilometer-notification-agent" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.871486 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="ceilometer-notification-agent" Jan 29 14:21:26 crc kubenswrapper[4753]: E0129 14:21:26.871498 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="sg-core" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.871504 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="sg-core" Jan 29 14:21:26 crc kubenswrapper[4753]: E0129 14:21:26.871521 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" 
containerName="ceilometer-central-agent" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.871526 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="ceilometer-central-agent" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.871668 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="proxy-httpd" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.871696 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="sg-core" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.871715 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="ceilometer-notification-agent" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.871725 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" containerName="ceilometer-central-agent" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.873119 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.875355 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.876063 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.885935 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.927154 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.927476 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.927524 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-run-httpd\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.927541 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-config-data\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.927663 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-scripts\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " 
pod="openstack/ceilometer-0" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.927704 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v9kc\" (UniqueName: \"kubernetes.io/projected/af576162-4633-43b7-908f-a6dc16560cc7-kube-api-access-2v9kc\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.927771 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-log-httpd\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.930751 4753 scope.go:117] "RemoveContainer" containerID="ce57ea4271b091b656e99f290c9acae064d40a8fc209b44e9ae762dd4dfb8ffb" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.949819 4753 scope.go:117] "RemoveContainer" containerID="3d5f6cf41f97cc1c79168168e732ba985f336c94896a7f61c62971e27348ed24" Jan 29 14:21:26 crc kubenswrapper[4753]: I0129 14:21:26.966763 4753 scope.go:117] "RemoveContainer" containerID="ff655e22020abc68977684e4d6b5233ddce78356b5d081acecc4a689f675327e" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.028835 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-scripts\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.028881 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v9kc\" (UniqueName: \"kubernetes.io/projected/af576162-4633-43b7-908f-a6dc16560cc7-kube-api-access-2v9kc\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.028954 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-log-httpd\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.028988 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.029026 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.029066 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-run-httpd\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.029091 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-config-data\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.029639 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-log-httpd\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.029701 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-run-httpd\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.036903 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-config-data\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.036913 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-scripts\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.037631 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.038955 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.055645 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.055763 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.060803 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v9kc\" (UniqueName: \"kubernetes.io/projected/af576162-4633-43b7-908f-a6dc16560cc7-kube-api-access-2v9kc\") pod \"ceilometer-0\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " pod="openstack/ceilometer-0" Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.198562 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.682514 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:21:27 crc kubenswrapper[4753]: W0129 14:21:27.695636 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf576162_4633_43b7_908f_a6dc16560cc7.slice/crio-25764975640a5623a4e5d595bed919920d711fdca6bea86f11d08fd17abb721c WatchSource:0}: Error finding container 25764975640a5623a4e5d595bed919920d711fdca6bea86f11d08fd17abb721c: Status 404 returned error can't find the container with id 25764975640a5623a4e5d595bed919920d711fdca6bea86f11d08fd17abb721c
Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.821269 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerStarted","Data":"25764975640a5623a4e5d595bed919920d711fdca6bea86f11d08fd17abb721c"}
Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.823498 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-754c57f55b-2hkbd"
Jan 29 14:21:27 crc kubenswrapper[4753]: I0129 14:21:27.823559 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-754c57f55b-2hkbd"
Jan 29 14:21:28 crc kubenswrapper[4753]: I0129 14:21:28.192289 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae4b866a-f13e-4cfd-95c1-50b64d249217" path="/var/lib/kubelet/pods/ae4b866a-f13e-4cfd-95c1-50b64d249217/volumes"
Jan 29 14:21:28 crc kubenswrapper[4753]: I0129 14:21:28.613174 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d9fc848-865w5"
Jan 29 14:21:28 crc kubenswrapper[4753]: I0129 14:21:28.833429 4753 generic.go:334] "Generic (PLEG): container finished" podID="a7b64f0e-f7ef-4737-a543-edba91ed6811" containerID="3b46cefbd0f083420a3feabe6fcc97942c18ff0babb3df072dd24c5df0a8f895" exitCode=0
Jan 29 14:21:28 crc kubenswrapper[4753]: I0129 14:21:28.833493 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-79v95" event={"ID":"a7b64f0e-f7ef-4737-a543-edba91ed6811","Type":"ContainerDied","Data":"3b46cefbd0f083420a3feabe6fcc97942c18ff0babb3df072dd24c5df0a8f895"}
Jan 29 14:21:28 crc kubenswrapper[4753]: I0129 14:21:28.835675 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerStarted","Data":"8c8b56903eb2784b7ee4eb6de5c0202f25c9e47f856620b26ff13909885877da"}
Jan 29 14:21:29 crc kubenswrapper[4753]: I0129 14:21:29.559416 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:29 crc kubenswrapper[4753]: I0129 14:21:29.559771 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:29 crc kubenswrapper[4753]: I0129 14:21:29.851397 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerStarted","Data":"981d9dd663e275310464c075a209536c041ffd974fc888e0d90e3a0184663d11"}
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.117544 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d9fc848-865w5"
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.301053 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-79v95"
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.405857 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-config-data\") pod \"a7b64f0e-f7ef-4737-a543-edba91ed6811\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") "
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.405968 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7b64f0e-f7ef-4737-a543-edba91ed6811-etc-machine-id\") pod \"a7b64f0e-f7ef-4737-a543-edba91ed6811\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") "
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.405995 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-combined-ca-bundle\") pod \"a7b64f0e-f7ef-4737-a543-edba91ed6811\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") "
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.406018 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-scripts\") pod \"a7b64f0e-f7ef-4737-a543-edba91ed6811\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") "
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.406128 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jwj6\" (UniqueName: \"kubernetes.io/projected/a7b64f0e-f7ef-4737-a543-edba91ed6811-kube-api-access-5jwj6\") pod \"a7b64f0e-f7ef-4737-a543-edba91ed6811\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") "
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.406202 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-db-sync-config-data\") pod \"a7b64f0e-f7ef-4737-a543-edba91ed6811\" (UID: \"a7b64f0e-f7ef-4737-a543-edba91ed6811\") "
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.406450 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7b64f0e-f7ef-4737-a543-edba91ed6811-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a7b64f0e-f7ef-4737-a543-edba91ed6811" (UID: "a7b64f0e-f7ef-4737-a543-edba91ed6811"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.406717 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7b64f0e-f7ef-4737-a543-edba91ed6811-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.411612 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7b64f0e-f7ef-4737-a543-edba91ed6811-kube-api-access-5jwj6" (OuterVolumeSpecName: "kube-api-access-5jwj6") pod "a7b64f0e-f7ef-4737-a543-edba91ed6811" (UID: "a7b64f0e-f7ef-4737-a543-edba91ed6811"). InnerVolumeSpecName "kube-api-access-5jwj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.425108 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a7b64f0e-f7ef-4737-a543-edba91ed6811" (UID: "a7b64f0e-f7ef-4737-a543-edba91ed6811"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.425319 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-scripts" (OuterVolumeSpecName: "scripts") pod "a7b64f0e-f7ef-4737-a543-edba91ed6811" (UID: "a7b64f0e-f7ef-4737-a543-edba91ed6811"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.439574 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7b64f0e-f7ef-4737-a543-edba91ed6811" (UID: "a7b64f0e-f7ef-4737-a543-edba91ed6811"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.457582 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-config-data" (OuterVolumeSpecName: "config-data") pod "a7b64f0e-f7ef-4737-a543-edba91ed6811" (UID: "a7b64f0e-f7ef-4737-a543-edba91ed6811"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.508202 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.508233 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.508242 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jwj6\" (UniqueName: \"kubernetes.io/projected/a7b64f0e-f7ef-4737-a543-edba91ed6811-kube-api-access-5jwj6\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.508253 4753 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.508261 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b64f0e-f7ef-4737-a543-edba91ed6811-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.865708 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-79v95"
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.865713 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-79v95" event={"ID":"a7b64f0e-f7ef-4737-a543-edba91ed6811","Type":"ContainerDied","Data":"b2bf2be42c6804e51c541b9a00622e903f0dbabe0b17f466e73a5738805284ed"}
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.866087 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2bf2be42c6804e51c541b9a00622e903f0dbabe0b17f466e73a5738805284ed"
Jan 29 14:21:30 crc kubenswrapper[4753]: I0129 14:21:30.868567 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerStarted","Data":"f35160890c1be25194e2028f513efc9f2a5c8394af2ddf82cb8ef6da1fdb2e90"}
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.255363 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 14:21:31 crc kubenswrapper[4753]: E0129 14:21:31.259846 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b64f0e-f7ef-4737-a543-edba91ed6811" containerName="cinder-db-sync"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.259874 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b64f0e-f7ef-4737-a543-edba91ed6811" containerName="cinder-db-sync"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.260058 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7b64f0e-f7ef-4737-a543-edba91ed6811" containerName="cinder-db-sync"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.260959 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.266145 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.266563 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.266708 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.266867 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mckjn"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.274167 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.322975 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-scripts\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.323033 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.323064 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.323087 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvx9x\" (UniqueName: \"kubernetes.io/projected/d6feeb92-3011-4e0e-b50d-9839bd14265f-kube-api-access-tvx9x\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.323134 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.323178 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6feeb92-3011-4e0e-b50d-9839bd14265f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.375080 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-hjmpz"]
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.375438 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" podUID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" containerName="dnsmasq-dns" containerID="cri-o://0b6e62b72fa89354b656420159d71664b787f973cbac0aca4b873ca56e236d8e" gracePeriod=10
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.385071 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.396951 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-j42ph"]
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.414634 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.419669 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-j42ph"]
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.426079 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-scripts\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.426136 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.426199 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.426221 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvx9x\" (UniqueName: \"kubernetes.io/projected/d6feeb92-3011-4e0e-b50d-9839bd14265f-kube-api-access-tvx9x\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.426330 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.426353 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6feeb92-3011-4e0e-b50d-9839bd14265f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.426473 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6feeb92-3011-4e0e-b50d-9839bd14265f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.435047 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-scripts\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.435591 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.437540 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.445872 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.504748 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvx9x\" (UniqueName: \"kubernetes.io/projected/d6feeb92-3011-4e0e-b50d-9839bd14265f-kube-api-access-tvx9x\") pod \"cinder-scheduler-0\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.527814 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.528195 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-svc\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.528232 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-swift-storage-0\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.528351 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-config\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.528373 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbhh\" (UniqueName: \"kubernetes.io/projected/dbf11a80-f998-4c27-8534-c6634ef15703-kube-api-access-jdbhh\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.528396 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.588578 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.598936 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.600489 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.607939 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.608441 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.629556 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-config\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.629600 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbhh\" (UniqueName: \"kubernetes.io/projected/dbf11a80-f998-4c27-8534-c6634ef15703-kube-api-access-jdbhh\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.629622 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.629664 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.629693 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-svc\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.629725 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-swift-storage-0\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.630853 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-swift-storage-0\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.630961 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-config\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.630859 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-sb\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.631463 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-svc\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.637607 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-nb\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.653028 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbhh\" (UniqueName: \"kubernetes.io/projected/dbf11a80-f998-4c27-8534-c6634ef15703-kube-api-access-jdbhh\") pod \"dnsmasq-dns-6b4f5fc4f-j42ph\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.730792 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-logs\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.730847 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.730879 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-scripts\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.730911 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmqpq\" (UniqueName: \"kubernetes.io/projected/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-kube-api-access-bmqpq\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0"
Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.730980 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0"
\"kubernetes.io/host-path/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.731000 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data-custom\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.731055 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.735204 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.832581 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmqpq\" (UniqueName: \"kubernetes.io/projected/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-kube-api-access-bmqpq\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.832683 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.832707 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data-custom\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.832762 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.832790 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-logs\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.832813 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.832844 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-scripts\") pod \"cinder-api-0\" (UID: 
\"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.836534 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-logs\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.839648 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-scripts\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.840757 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data-custom\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.841112 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.843719 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.844934 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.873986 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmqpq\" (UniqueName: \"kubernetes.io/projected/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-kube-api-access-bmqpq\") pod \"cinder-api-0\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " pod="openstack/cinder-api-0" Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.942434 4753 generic.go:334] "Generic (PLEG): container finished" podID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" containerID="0b6e62b72fa89354b656420159d71664b787f973cbac0aca4b873ca56e236d8e" exitCode=0 Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.942485 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" event={"ID":"100d69bd-7c7b-41b8-84a6-c78e4480f0b4","Type":"ContainerDied","Data":"0b6e62b72fa89354b656420159d71664b787f973cbac0aca4b873ca56e236d8e"} Jan 29 14:21:31 crc kubenswrapper[4753]: I0129 14:21:31.952909 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.204843 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.268868 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-svc\") pod \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.269377 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-config\") pod \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.269427 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-nb\") pod \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.269500 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pslf2\" (UniqueName: \"kubernetes.io/projected/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-kube-api-access-pslf2\") pod \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.269577 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-sb\") pod \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.269604 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-swift-storage-0\") pod \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\" (UID: \"100d69bd-7c7b-41b8-84a6-c78e4480f0b4\") " Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.333444 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-kube-api-access-pslf2" (OuterVolumeSpecName: "kube-api-access-pslf2") pod "100d69bd-7c7b-41b8-84a6-c78e4480f0b4" (UID: "100d69bd-7c7b-41b8-84a6-c78e4480f0b4"). InnerVolumeSpecName "kube-api-access-pslf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.381086 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.410794 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pslf2\" (UniqueName: \"kubernetes.io/projected/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-kube-api-access-pslf2\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.456352 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-j42ph"] Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.458388 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "100d69bd-7c7b-41b8-84a6-c78e4480f0b4" (UID: "100d69bd-7c7b-41b8-84a6-c78e4480f0b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.460665 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "100d69bd-7c7b-41b8-84a6-c78e4480f0b4" (UID: "100d69bd-7c7b-41b8-84a6-c78e4480f0b4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.460675 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "100d69bd-7c7b-41b8-84a6-c78e4480f0b4" (UID: "100d69bd-7c7b-41b8-84a6-c78e4480f0b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.494590 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-config" (OuterVolumeSpecName: "config") pod "100d69bd-7c7b-41b8-84a6-c78e4480f0b4" (UID: "100d69bd-7c7b-41b8-84a6-c78e4480f0b4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.512368 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.512849 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.512860 4753 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.512869 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.560321 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "100d69bd-7c7b-41b8-84a6-c78e4480f0b4" (UID: "100d69bd-7c7b-41b8-84a6-c78e4480f0b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.616382 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/100d69bd-7c7b-41b8-84a6-c78e4480f0b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.849232 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:21:32 crc kubenswrapper[4753]: W0129 14:21:32.871549 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd67c3c44_b4b6_48ab_bbd1_943b6ea69338.slice/crio-433efb1919aa19f7038106e5748409ca3b8137010deab9e2320b2dd5bd6fa5ba WatchSource:0}: Error finding container 433efb1919aa19f7038106e5748409ca3b8137010deab9e2320b2dd5bd6fa5ba: Status 404 returned error can't find the container with id 433efb1919aa19f7038106e5748409ca3b8137010deab9e2320b2dd5bd6fa5ba Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.932127 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.957680 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d67c3c44-b4b6-48ab-bbd1-943b6ea69338","Type":"ContainerStarted","Data":"433efb1919aa19f7038106e5748409ca3b8137010deab9e2320b2dd5bd6fa5ba"} Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.966785 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" event={"ID":"100d69bd-7c7b-41b8-84a6-c78e4480f0b4","Type":"ContainerDied","Data":"8942c52cdb0ba13262adc7b084123d1d1360e21dae951eb3494ece3295b87db3"} Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.966839 4753 scope.go:117] "RemoveContainer" containerID="0b6e62b72fa89354b656420159d71664b787f973cbac0aca4b873ca56e236d8e" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.966999 4753 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" Jan 29 14:21:32 crc kubenswrapper[4753]: I0129 14:21:32.978440 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d6feeb92-3011-4e0e-b50d-9839bd14265f","Type":"ContainerStarted","Data":"b5e804a8de55dde6ed52495c50675acd15cda0ac445835a4f35e10202857acd6"} Jan 29 14:21:33 crc kubenswrapper[4753]: I0129 14:21:33.006991 4753 generic.go:334] "Generic (PLEG): container finished" podID="dbf11a80-f998-4c27-8534-c6634ef15703" containerID="f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7" exitCode=0 Jan 29 14:21:33 crc kubenswrapper[4753]: I0129 14:21:33.007036 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" event={"ID":"dbf11a80-f998-4c27-8534-c6634ef15703","Type":"ContainerDied","Data":"f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7"} Jan 29 14:21:33 crc kubenswrapper[4753]: I0129 14:21:33.007222 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" event={"ID":"dbf11a80-f998-4c27-8534-c6634ef15703","Type":"ContainerStarted","Data":"bb4b8c9c6714ae832c15e65edc1dc6e985937e4cce85dde6168b7fa51ce8bdc8"} Jan 29 14:21:33 crc kubenswrapper[4753]: I0129 14:21:33.023467 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-hjmpz"] Jan 29 14:21:33 crc kubenswrapper[4753]: I0129 14:21:33.026286 4753 scope.go:117] "RemoveContainer" containerID="6b2cb51caa1768f1c65c4de91f615e7e3e193803e87b6dc9802ce542552641b3" Jan 29 14:21:33 crc kubenswrapper[4753]: I0129 14:21:33.044271 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54c4dfcffc-hjmpz"] Jan 29 14:21:33 crc kubenswrapper[4753]: I0129 14:21:33.562860 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.026947 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" event={"ID":"dbf11a80-f998-4c27-8534-c6634ef15703","Type":"ContainerStarted","Data":"0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864"} Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.027411 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.033117 4753 generic.go:334] "Generic (PLEG): container finished" podID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerID="da1ae90af0cb13e22f00fa6061683c30f0ffd82c7f6ae4ec521d3a1a0aea10b7" exitCode=137 Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.033210 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5a9b99a-4591-4d8b-8534-115cf9d549e1","Type":"ContainerDied","Data":"da1ae90af0cb13e22f00fa6061683c30f0ffd82c7f6ae4ec521d3a1a0aea10b7"} Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.037053 4753 util.go:48] "No ready sandbox for pod can be found. 
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.037994 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerStarted","Data":"79566b6b49c7579aa3f624c9963d2eae687a842e01846f5ebf14cdb5b4ee1694"}
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.038330 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.049639 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" podStartSLOduration=3.0496091339999998 podStartE2EDuration="3.049609134s" podCreationTimestamp="2026-01-29 14:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:34.042626765 +0000 UTC m=+1128.737361157" watchObservedRunningTime="2026-01-29 14:21:34.049609134 +0000 UTC m=+1128.744343516"
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.131303 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.034044488 podStartE2EDuration="8.131285828s" podCreationTimestamp="2026-01-29 14:21:26 +0000 UTC" firstStartedPulling="2026-01-29 14:21:27.698431108 +0000 UTC m=+1122.393165520" lastFinishedPulling="2026-01-29 14:21:32.795672478 +0000 UTC m=+1127.490406860" observedRunningTime="2026-01-29 14:21:34.113785566 +0000 UTC m=+1128.808519948" watchObservedRunningTime="2026-01-29 14:21:34.131285828 +0000 UTC m=+1128.826020210"
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.162762 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" path="/var/lib/kubelet/pods/100d69bd-7c7b-41b8-84a6-c78e4480f0b4/volumes"
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.168910 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-config-data\") pod \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") "
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.169008 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-httpd-run\") pod \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") "
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.169090 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-logs\") pod \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") "
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.169633 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-logs" (OuterVolumeSpecName: "logs") pod "b5a9b99a-4591-4d8b-8534-115cf9d549e1" (UID: "b5a9b99a-4591-4d8b-8534-115cf9d549e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.169724 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") "
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.169860 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b5a9b99a-4591-4d8b-8534-115cf9d549e1" (UID: "b5a9b99a-4591-4d8b-8534-115cf9d549e1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.170076 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-combined-ca-bundle\") pod \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") "
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.170219 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmhfl\" (UniqueName: \"kubernetes.io/projected/b5a9b99a-4591-4d8b-8534-115cf9d549e1-kube-api-access-wmhfl\") pod \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") "
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.170241 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-scripts\") pod \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\" (UID: \"b5a9b99a-4591-4d8b-8534-115cf9d549e1\") "
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.171092 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.171108 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a9b99a-4591-4d8b-8534-115cf9d549e1-logs\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.179718 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-scripts" (OuterVolumeSpecName: "scripts") pod "b5a9b99a-4591-4d8b-8534-115cf9d549e1" (UID: "b5a9b99a-4591-4d8b-8534-115cf9d549e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.187603 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a9b99a-4591-4d8b-8534-115cf9d549e1-kube-api-access-wmhfl" (OuterVolumeSpecName: "kube-api-access-wmhfl") pod "b5a9b99a-4591-4d8b-8534-115cf9d549e1" (UID: "b5a9b99a-4591-4d8b-8534-115cf9d549e1"). InnerVolumeSpecName "kube-api-access-wmhfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.188846 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "b5a9b99a-4591-4d8b-8534-115cf9d549e1" (UID: "b5a9b99a-4591-4d8b-8534-115cf9d549e1"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.228660 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5a9b99a-4591-4d8b-8534-115cf9d549e1" (UID: "b5a9b99a-4591-4d8b-8534-115cf9d549e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.249120 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-config-data" (OuterVolumeSpecName: "config-data") pod "b5a9b99a-4591-4d8b-8534-115cf9d549e1" (UID: "b5a9b99a-4591-4d8b-8534-115cf9d549e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.272515 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmhfl\" (UniqueName: \"kubernetes.io/projected/b5a9b99a-4591-4d8b-8534-115cf9d549e1-kube-api-access-wmhfl\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.272551 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.272561 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.272580 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.272590 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a9b99a-4591-4d8b-8534-115cf9d549e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.300442 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.374914 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.669037 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-754c57f55b-2hkbd"
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.778083 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9d9fc848-865w5"]
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.778347 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9d9fc848-865w5" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api-log" containerID="cri-o://8b1ef8941d5ded68d97fcd43e2874bf764c0c6c81914b6d866e1f640938892bf" gracePeriod=30
Jan 29 14:21:34 crc kubenswrapper[4753]: I0129 14:21:34.778492 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9d9fc848-865w5" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api" containerID="cri-o://ceaef1ceec2f6ca383381ca30573762ffc047afa4a7e186ad2cbedd99e33d17e" gracePeriod=30
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.089678 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d67c3c44-b4b6-48ab-bbd1-943b6ea69338","Type":"ContainerStarted","Data":"e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61"}
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.089944 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d67c3c44-b4b6-48ab-bbd1-943b6ea69338","Type":"ContainerStarted","Data":"260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342"}
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.089884 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerName="cinder-api-log" containerID="cri-o://260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342" gracePeriod=30
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.090085 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.090172 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerName="cinder-api" containerID="cri-o://e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61" gracePeriod=30
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.101316 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d6feeb92-3011-4e0e-b50d-9839bd14265f","Type":"ContainerStarted","Data":"a2184a17ca6189abab48b4f8c663c1b4774fe6572faf2feded15b0aa9738116d"}
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.106879 4753 generic.go:334] "Generic (PLEG): container finished" podID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerID="8b1ef8941d5ded68d97fcd43e2874bf764c0c6c81914b6d866e1f640938892bf" exitCode=143
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.106950 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fc848-865w5" event={"ID":"79c4f6e1-381b-4c9f-b265-547887d39ab1","Type":"ContainerDied","Data":"8b1ef8941d5ded68d97fcd43e2874bf764c0c6c81914b6d866e1f640938892bf"}
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.130569 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.134477 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5a9b99a-4591-4d8b-8534-115cf9d549e1","Type":"ContainerDied","Data":"994455796a6f1c15ea2fc2de261033fa6223d25c7faa6762d78d9fb8dee77369"}
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.134547 4753 scope.go:117] "RemoveContainer" containerID="da1ae90af0cb13e22f00fa6061683c30f0ffd82c7f6ae4ec521d3a1a0aea10b7"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.172637 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.172595685 podStartE2EDuration="4.172595685s" podCreationTimestamp="2026-01-29 14:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:35.116608124 +0000 UTC m=+1129.811342506" watchObservedRunningTime="2026-01-29 14:21:35.172595685 +0000 UTC m=+1129.867330077"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.175576 4753 scope.go:117] "RemoveContainer" containerID="6cf829791a4f4878cfbaec0b92a1f1a45349cf6ae7bd635cbe469c9416b023c3"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.179742 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.198453 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.208168 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 14:21:35 crc kubenswrapper[4753]: E0129 14:21:35.208521 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" containerName="dnsmasq-dns"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.208538 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" containerName="dnsmasq-dns"
Jan 29 14:21:35 crc kubenswrapper[4753]: E0129 14:21:35.208562 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" containerName="init"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.208568 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" containerName="init"
Jan 29 14:21:35 crc kubenswrapper[4753]: E0129 14:21:35.208584 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerName="glance-log"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.208590 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerName="glance-log"
Jan 29 14:21:35 crc kubenswrapper[4753]: E0129 14:21:35.208600 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerName="glance-httpd"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.208605 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerName="glance-httpd"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.208782 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" containerName="dnsmasq-dns"
Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.208800 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerName="glance-log"
kubenswrapper[4753]: I0129 14:21:35.208800 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerName="glance-log" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.208808 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" containerName="glance-httpd" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.210012 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.212463 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.216430 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.233670 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.298962 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.299282 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.299318 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.299345 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-logs\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.299369 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.299391 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-scripts\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.299447 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jldnx\" (UniqueName: \"kubernetes.io/projected/48f31315-8403-4d07-927f-4fe66c4db88b-kube-api-access-jldnx\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.299474 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-config-data\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.401799 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jldnx\" (UniqueName: \"kubernetes.io/projected/48f31315-8403-4d07-927f-4fe66c4db88b-kube-api-access-jldnx\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.401866 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-config-data\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.401968 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.401994 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.402037 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.402074 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-logs\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.402130 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.402175 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-scripts\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.402479 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.402543 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.402636 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-logs\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.411203 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.415714 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-scripts\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.421212 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-config-data\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.424638 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.431808 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jldnx\" (UniqueName: \"kubernetes.io/projected/48f31315-8403-4d07-927f-4fe66c4db88b-kube-api-access-jldnx\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.440780 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.556829 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:21:35 crc kubenswrapper[4753]: I0129 14:21:35.996040 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.122953 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-etc-machine-id\") pod \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.123100 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-logs\") pod \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.123129 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-combined-ca-bundle\") pod \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.123219 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data-custom\") pod \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.123333 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmqpq\" (UniqueName: \"kubernetes.io/projected/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-kube-api-access-bmqpq\") pod \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.123432 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-scripts\") pod \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.123610 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data\") pod \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\" (UID: \"d67c3c44-b4b6-48ab-bbd1-943b6ea69338\") " Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.124191 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d67c3c44-b4b6-48ab-bbd1-943b6ea69338" (UID: "d67c3c44-b4b6-48ab-bbd1-943b6ea69338"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.124704 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-logs" (OuterVolumeSpecName: "logs") pod "d67c3c44-b4b6-48ab-bbd1-943b6ea69338" (UID: "d67c3c44-b4b6-48ab-bbd1-943b6ea69338"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.125404 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.125431 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.130143 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-kube-api-access-bmqpq" (OuterVolumeSpecName: "kube-api-access-bmqpq") pod "d67c3c44-b4b6-48ab-bbd1-943b6ea69338" (UID: "d67c3c44-b4b6-48ab-bbd1-943b6ea69338"). InnerVolumeSpecName "kube-api-access-bmqpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.133388 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d67c3c44-b4b6-48ab-bbd1-943b6ea69338" (UID: "d67c3c44-b4b6-48ab-bbd1-943b6ea69338"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.133411 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-scripts" (OuterVolumeSpecName: "scripts") pod "d67c3c44-b4b6-48ab-bbd1-943b6ea69338" (UID: "d67c3c44-b4b6-48ab-bbd1-943b6ea69338"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.151075 4753 generic.go:334] "Generic (PLEG): container finished" podID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerID="e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61" exitCode=0 Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.151110 4753 generic.go:334] "Generic (PLEG): container finished" podID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerID="260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342" exitCode=143 Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.151360 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.160196 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a9b99a-4591-4d8b-8534-115cf9d549e1" path="/var/lib/kubelet/pods/b5a9b99a-4591-4d8b-8534-115cf9d549e1/volumes" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.162922 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d67c3c44-b4b6-48ab-bbd1-943b6ea69338" (UID: "d67c3c44-b4b6-48ab-bbd1-943b6ea69338"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.199363 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data" (OuterVolumeSpecName: "config-data") pod "d67c3c44-b4b6-48ab-bbd1-943b6ea69338" (UID: "d67c3c44-b4b6-48ab-bbd1-943b6ea69338"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.207664 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.218658318 podStartE2EDuration="5.207635622s" podCreationTimestamp="2026-01-29 14:21:31 +0000 UTC" firstStartedPulling="2026-01-29 14:21:32.339431444 +0000 UTC m=+1127.034165816" lastFinishedPulling="2026-01-29 14:21:33.328408738 +0000 UTC m=+1128.023143120" observedRunningTime="2026-01-29 14:21:36.202331899 +0000 UTC m=+1130.897066281" watchObservedRunningTime="2026-01-29 14:21:36.207635622 +0000 UTC m=+1130.902370004" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.227576 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.227620 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.227633 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmqpq\" (UniqueName: \"kubernetes.io/projected/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-kube-api-access-bmqpq\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.227647 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.227659 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67c3c44-b4b6-48ab-bbd1-943b6ea69338-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:36 crc kubenswrapper[4753]: W0129 14:21:36.254760 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f31315_8403_4d07_927f_4fe66c4db88b.slice/crio-a12b94b86b73ac008156e4245fb3aad2466169d5aa75a4d5e17208ff728e41c9 WatchSource:0}: Error finding container a12b94b86b73ac008156e4245fb3aad2466169d5aa75a4d5e17208ff728e41c9: 
Status 404 returned error can't find the container with id a12b94b86b73ac008156e4245fb3aad2466169d5aa75a4d5e17208ff728e41c9 Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.302108 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d67c3c44-b4b6-48ab-bbd1-943b6ea69338","Type":"ContainerDied","Data":"e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61"} Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.302185 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d67c3c44-b4b6-48ab-bbd1-943b6ea69338","Type":"ContainerDied","Data":"260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342"} Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.302215 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.302233 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d67c3c44-b4b6-48ab-bbd1-943b6ea69338","Type":"ContainerDied","Data":"433efb1919aa19f7038106e5748409ca3b8137010deab9e2320b2dd5bd6fa5ba"} Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.302269 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d6feeb92-3011-4e0e-b50d-9839bd14265f","Type":"ContainerStarted","Data":"d16d9a2ee2633cbe1578fd0a07f3702029a9684207a8d67784588168d6b4cdcd"} Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.302297 4753 scope.go:117] "RemoveContainer" containerID="e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.327593 4753 scope.go:117] "RemoveContainer" containerID="260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.360416 4753 scope.go:117] "RemoveContainer" containerID="e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61" Jan 29 14:21:36 crc kubenswrapper[4753]: E0129 14:21:36.360847 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61\": container with ID starting with e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61 not found: ID does not exist" containerID="e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.360887 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61"} err="failed to get container status \"e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61\": rpc error: code = NotFound desc = could not find container \"e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61\": container with ID starting with e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61 not found: ID does not exist" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.360908 4753 scope.go:117] "RemoveContainer" containerID="260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342" Jan 29 14:21:36 crc kubenswrapper[4753]: E0129 14:21:36.361140 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342\": container with ID starting with 
260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342 not found: ID does not exist" containerID="260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.361200 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342"} err="failed to get container status \"260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342\": rpc error: code = NotFound desc = could not find container \"260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342\": container with ID starting with 260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342 not found: ID does not exist" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.361223 4753 scope.go:117] "RemoveContainer" containerID="e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.365394 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61"} err="failed to get container status \"e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61\": rpc error: code = NotFound desc = could not find container \"e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61\": container with ID starting with e1abba1d402ab8339127b2681b5bc3c17ccafc2d70c59eaa1794363a36ef8f61 not found: ID does not exist" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.365421 4753 scope.go:117] "RemoveContainer" containerID="260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.367726 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342"} err="failed to get container status \"260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342\": rpc error: code = NotFound desc = could not find container \"260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342\": container with ID starting with 260a3e4172265e410c2c7caf04d2154d37f0a025a5333f1833de09c29adac342 not found: ID does not exist" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.499749 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.507473 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.535226 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:21:36 crc kubenswrapper[4753]: E0129 14:21:36.535591 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerName="cinder-api-log" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.535609 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerName="cinder-api-log" Jan 29 14:21:36 crc kubenswrapper[4753]: E0129 14:21:36.535625 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerName="cinder-api" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.535634 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerName="cinder-api" Jan 29 14:21:36 crc 
kubenswrapper[4753]: I0129 14:21:36.535807 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerName="cinder-api" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.535831 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" containerName="cinder-api-log" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.536696 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.539728 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.539979 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.544309 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.552623 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.593982 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.632343 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.632383 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.632443 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.632478 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-scripts\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.632503 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.632554 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2dbc378-044c-49a2-a891-94a90a0acff1-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.632603 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-924sv\" (UniqueName: \"kubernetes.io/projected/f2dbc378-044c-49a2-a891-94a90a0acff1-kube-api-access-924sv\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.632619 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data-custom\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.632636 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2dbc378-044c-49a2-a891-94a90a0acff1-logs\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.735119 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-924sv\" (UniqueName: \"kubernetes.io/projected/f2dbc378-044c-49a2-a891-94a90a0acff1-kube-api-access-924sv\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.735441 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data-custom\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.735463 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2dbc378-044c-49a2-a891-94a90a0acff1-logs\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.735508 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.735527 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.735700 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.735836 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-scripts\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.735857 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.735995 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2dbc378-044c-49a2-a891-94a90a0acff1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.737866 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2dbc378-044c-49a2-a891-94a90a0acff1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.738108 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2dbc378-044c-49a2-a891-94a90a0acff1-logs\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.744778 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-scripts\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.744795 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.749254 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.749745 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.763607 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.765753 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-924sv\" (UniqueName: 
\"kubernetes.io/projected/f2dbc378-044c-49a2-a891-94a90a0acff1-kube-api-access-924sv\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.777728 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data-custom\") pod \"cinder-api-0\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " pod="openstack/cinder-api-0" Jan 29 14:21:36 crc kubenswrapper[4753]: I0129 14:21:36.894441 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 14:21:37 crc kubenswrapper[4753]: I0129 14:21:37.013374 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54c4dfcffc-hjmpz" podUID="100d69bd-7c7b-41b8-84a6-c78e4480f0b4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.159:5353: i/o timeout" Jan 29 14:21:37 crc kubenswrapper[4753]: I0129 14:21:37.189899 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48f31315-8403-4d07-927f-4fe66c4db88b","Type":"ContainerStarted","Data":"f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224"} Jan 29 14:21:37 crc kubenswrapper[4753]: I0129 14:21:37.189956 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48f31315-8403-4d07-927f-4fe66c4db88b","Type":"ContainerStarted","Data":"a12b94b86b73ac008156e4245fb3aad2466169d5aa75a4d5e17208ff728e41c9"} Jan 29 14:21:37 crc kubenswrapper[4753]: I0129 14:21:37.399805 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:21:37 crc kubenswrapper[4753]: I0129 14:21:37.982963 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d9fc848-865w5" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:36362->10.217.0.160:9311: read: connection reset by peer" Jan 29 14:21:37 crc kubenswrapper[4753]: I0129 14:21:37.983677 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d9fc848-865w5" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:36358->10.217.0.160:9311: read: connection reset by peer" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.174478 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67c3c44-b4b6-48ab-bbd1-943b6ea69338" path="/var/lib/kubelet/pods/d67c3c44-b4b6-48ab-bbd1-943b6ea69338/volumes" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.266610 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48f31315-8403-4d07-927f-4fe66c4db88b","Type":"ContainerStarted","Data":"372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c"} Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.285203 4753 generic.go:334] "Generic (PLEG): container finished" podID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerID="ceaef1ceec2f6ca383381ca30573762ffc047afa4a7e186ad2cbedd99e33d17e" exitCode=0 Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.285280 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-9d9fc848-865w5" event={"ID":"79c4f6e1-381b-4c9f-b265-547887d39ab1","Type":"ContainerDied","Data":"ceaef1ceec2f6ca383381ca30573762ffc047afa4a7e186ad2cbedd99e33d17e"} Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.303426 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f2dbc378-044c-49a2-a891-94a90a0acff1","Type":"ContainerStarted","Data":"6b786f25bafe8d6c3eaf139cfc995408a12a5b9be476cf604c495d5f9896ac60"} Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.512802 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.552582 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.552552024 podStartE2EDuration="3.552552024s" podCreationTimestamp="2026-01-29 14:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:38.325106376 +0000 UTC m=+1133.019840778" watchObservedRunningTime="2026-01-29 14:21:38.552552024 +0000 UTC m=+1133.247286406" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.595376 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-combined-ca-bundle\") pod \"79c4f6e1-381b-4c9f-b265-547887d39ab1\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.595453 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data-custom\") pod \"79c4f6e1-381b-4c9f-b265-547887d39ab1\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.595634 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brwm2\" (UniqueName: \"kubernetes.io/projected/79c4f6e1-381b-4c9f-b265-547887d39ab1-kube-api-access-brwm2\") pod \"79c4f6e1-381b-4c9f-b265-547887d39ab1\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.595670 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data\") pod \"79c4f6e1-381b-4c9f-b265-547887d39ab1\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.595728 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79c4f6e1-381b-4c9f-b265-547887d39ab1-logs\") pod \"79c4f6e1-381b-4c9f-b265-547887d39ab1\" (UID: \"79c4f6e1-381b-4c9f-b265-547887d39ab1\") " Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.597251 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c4f6e1-381b-4c9f-b265-547887d39ab1-logs" (OuterVolumeSpecName: "logs") pod "79c4f6e1-381b-4c9f-b265-547887d39ab1" (UID: "79c4f6e1-381b-4c9f-b265-547887d39ab1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.602006 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79c4f6e1-381b-4c9f-b265-547887d39ab1" (UID: "79c4f6e1-381b-4c9f-b265-547887d39ab1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.602653 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c4f6e1-381b-4c9f-b265-547887d39ab1-kube-api-access-brwm2" (OuterVolumeSpecName: "kube-api-access-brwm2") pod "79c4f6e1-381b-4c9f-b265-547887d39ab1" (UID: "79c4f6e1-381b-4c9f-b265-547887d39ab1"). InnerVolumeSpecName "kube-api-access-brwm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.651615 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79c4f6e1-381b-4c9f-b265-547887d39ab1" (UID: "79c4f6e1-381b-4c9f-b265-547887d39ab1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.661654 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data" (OuterVolumeSpecName: "config-data") pod "79c4f6e1-381b-4c9f-b265-547887d39ab1" (UID: "79c4f6e1-381b-4c9f-b265-547887d39ab1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.699000 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.699053 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79c4f6e1-381b-4c9f-b265-547887d39ab1-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.699068 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.699078 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79c4f6e1-381b-4c9f-b265-547887d39ab1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:38 crc kubenswrapper[4753]: I0129 14:21:38.699089 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brwm2\" (UniqueName: \"kubernetes.io/projected/79c4f6e1-381b-4c9f-b265-547887d39ab1-kube-api-access-brwm2\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.318744 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f2dbc378-044c-49a2-a891-94a90a0acff1","Type":"ContainerStarted","Data":"a111f231b7967402acc681815c59ca3c8b5a6d1e5677b5d94e8de77f77841cbc"} Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.319314 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f2dbc378-044c-49a2-a891-94a90a0acff1","Type":"ContainerStarted","Data":"656f30c915a464984557b8cece588ce1d9b95d296a1ac31530c3c6877585393c"} Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.319335 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.324949 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d9fc848-865w5" event={"ID":"79c4f6e1-381b-4c9f-b265-547887d39ab1","Type":"ContainerDied","Data":"2a000feffcc85d3c79bc62afeb68221a77990935841b6f037fa71f150179e847"} Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.325032 4753 scope.go:117] "RemoveContainer" containerID="ceaef1ceec2f6ca383381ca30573762ffc047afa4a7e186ad2cbedd99e33d17e" Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.325249 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9d9fc848-865w5" Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.348060 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.348036765 podStartE2EDuration="3.348036765s" podCreationTimestamp="2026-01-29 14:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:39.336807422 +0000 UTC m=+1134.031541814" watchObservedRunningTime="2026-01-29 14:21:39.348036765 +0000 UTC m=+1134.042771157" Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.367293 4753 scope.go:117] "RemoveContainer" containerID="8b1ef8941d5ded68d97fcd43e2874bf764c0c6c81914b6d866e1f640938892bf" Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.413332 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9d9fc848-865w5"] Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.421188 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-9d9fc848-865w5"] Jan 29 14:21:39 crc kubenswrapper[4753]: I0129 14:21:39.690820 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:21:40 crc kubenswrapper[4753]: I0129 14:21:40.161017 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" path="/var/lib/kubelet/pods/79c4f6e1-381b-4c9f-b265-547887d39ab1/volumes" Jan 29 14:21:41 crc kubenswrapper[4753]: I0129 14:21:41.063404 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:41 crc kubenswrapper[4753]: I0129 14:21:41.100658 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56665b56dd-pw99j" Jan 29 14:21:41 crc kubenswrapper[4753]: I0129 14:21:41.102760 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:21:41 crc kubenswrapper[4753]: I0129 14:21:41.737306 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" Jan 29 14:21:41 crc kubenswrapper[4753]: I0129 14:21:41.808339 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-hh94p"] Jan 29 14:21:41 crc kubenswrapper[4753]: I0129 14:21:41.808590 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p" podUID="3b74239b-a7f3-4d1d-b292-50d131a0dad1" containerName="dnsmasq-dns" containerID="cri-o://cad549f567dc8d2e735609c8aa87f47f0cd16a596a682692a57f23801cc75689" gracePeriod=10 Jan 29 14:21:41 crc kubenswrapper[4753]: I0129 14:21:41.840352 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 29 14:21:41 crc kubenswrapper[4753]: I0129 14:21:41.927260 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.054030 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 29 14:21:42 crc kubenswrapper[4753]: E0129 14:21:42.054732 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api" Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.054749 4753 
state_mem.go:107] "Deleted CPUSet assignment" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api"
Jan 29 14:21:42 crc kubenswrapper[4753]: E0129 14:21:42.054775 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api-log"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.054783 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api-log"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.054961 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api-log"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.054986 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c4f6e1-381b-4c9f-b265-547887d39ab1" containerName="barbican-api"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.055511 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.063570 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.063834 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-qmzbc"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.063981 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.080132 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.195177 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config-secret\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.195257 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.195317 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.195359 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvszk\" (UniqueName: \"kubernetes.io/projected/606db0f5-bff8-4d65-bbad-63b8b1ba362c-kube-api-access-dvszk\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.297733 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config-secret\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.297819 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.297861 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.297904 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvszk\" (UniqueName: \"kubernetes.io/projected/606db0f5-bff8-4d65-bbad-63b8b1ba362c-kube-api-access-dvszk\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.299025 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.313441 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config-secret\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.321996 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvszk\" (UniqueName: \"kubernetes.io/projected/606db0f5-bff8-4d65-bbad-63b8b1ba362c-kube-api-access-dvszk\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.328901 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.370734 4753 generic.go:334] "Generic (PLEG): container finished" podID="3b74239b-a7f3-4d1d-b292-50d131a0dad1" containerID="cad549f567dc8d2e735609c8aa87f47f0cd16a596a682692a57f23801cc75689" exitCode=0
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.371077 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerName="cinder-scheduler" containerID="cri-o://a2184a17ca6189abab48b4f8c663c1b4774fe6572faf2feded15b0aa9738116d" gracePeriod=30
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.371273 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p" event={"ID":"3b74239b-a7f3-4d1d-b292-50d131a0dad1","Type":"ContainerDied","Data":"cad549f567dc8d2e735609c8aa87f47f0cd16a596a682692a57f23801cc75689"}
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.372553 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerName="probe" containerID="cri-o://d16d9a2ee2633cbe1578fd0a07f3702029a9684207a8d67784588168d6b4cdcd" gracePeriod=30
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.378538 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.433902 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.506856 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-svc\") pod \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") "
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.506923 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-nb\") pod \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") "
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.506952 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-sb\") pod \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") "
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.508428 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h48v6\" (UniqueName: \"kubernetes.io/projected/3b74239b-a7f3-4d1d-b292-50d131a0dad1-kube-api-access-h48v6\") pod \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") "
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.508549 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-config\") pod \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") "
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.508668 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-swift-storage-0\") pod \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\" (UID: \"3b74239b-a7f3-4d1d-b292-50d131a0dad1\") "
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.516164 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b74239b-a7f3-4d1d-b292-50d131a0dad1-kube-api-access-h48v6" (OuterVolumeSpecName: "kube-api-access-h48v6") pod "3b74239b-a7f3-4d1d-b292-50d131a0dad1" (UID: "3b74239b-a7f3-4d1d-b292-50d131a0dad1"). InnerVolumeSpecName "kube-api-access-h48v6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.558093 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b74239b-a7f3-4d1d-b292-50d131a0dad1" (UID: "3b74239b-a7f3-4d1d-b292-50d131a0dad1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.573430 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b74239b-a7f3-4d1d-b292-50d131a0dad1" (UID: "3b74239b-a7f3-4d1d-b292-50d131a0dad1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.577432 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b74239b-a7f3-4d1d-b292-50d131a0dad1" (UID: "3b74239b-a7f3-4d1d-b292-50d131a0dad1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.577848 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b74239b-a7f3-4d1d-b292-50d131a0dad1" (UID: "3b74239b-a7f3-4d1d-b292-50d131a0dad1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.588696 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-config" (OuterVolumeSpecName: "config") pod "3b74239b-a7f3-4d1d-b292-50d131a0dad1" (UID: "3b74239b-a7f3-4d1d-b292-50d131a0dad1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.611732 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.611765 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.611777 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.611790 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h48v6\" (UniqueName: \"kubernetes.io/projected/3b74239b-a7f3-4d1d-b292-50d131a0dad1-kube-api-access-h48v6\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.611800 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-config\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.611809 4753 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b74239b-a7f3-4d1d-b292-50d131a0dad1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:42 crc kubenswrapper[4753]: I0129 14:21:42.933600 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Jan 29 14:21:42 crc kubenswrapper[4753]: W0129 14:21:42.935633 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod606db0f5_bff8_4d65_bbad_63b8b1ba362c.slice/crio-a9d20dfde079391dc2e3c5e58ea6c76b7259d59da0878146245d026515dfdf8d WatchSource:0}: Error finding container a9d20dfde079391dc2e3c5e58ea6c76b7259d59da0878146245d026515dfdf8d: Status 404 returned error can't find the container with id a9d20dfde079391dc2e3c5e58ea6c76b7259d59da0878146245d026515dfdf8d
Jan 29 14:21:43 crc kubenswrapper[4753]: I0129 14:21:43.383213 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"606db0f5-bff8-4d65-bbad-63b8b1ba362c","Type":"ContainerStarted","Data":"a9d20dfde079391dc2e3c5e58ea6c76b7259d59da0878146245d026515dfdf8d"}
Jan 29 14:21:43 crc kubenswrapper[4753]: I0129 14:21:43.386453 4753 generic.go:334] "Generic (PLEG): container finished" podID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerID="d16d9a2ee2633cbe1578fd0a07f3702029a9684207a8d67784588168d6b4cdcd" exitCode=0
Jan 29 14:21:43 crc kubenswrapper[4753]: I0129 14:21:43.386527 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d6feeb92-3011-4e0e-b50d-9839bd14265f","Type":"ContainerDied","Data":"d16d9a2ee2633cbe1578fd0a07f3702029a9684207a8d67784588168d6b4cdcd"}
Jan 29 14:21:43 crc kubenswrapper[4753]: I0129 14:21:43.388922 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p" event={"ID":"3b74239b-a7f3-4d1d-b292-50d131a0dad1","Type":"ContainerDied","Data":"73be0fc283e78f9aa9cd1424060227302861f495695d2ddc7218369bf45d0075"}
Jan 29 14:21:43 crc kubenswrapper[4753]: I0129 14:21:43.388985 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb67c87c9-hh94p"
Jan 29 14:21:43 crc kubenswrapper[4753]: I0129 14:21:43.389012 4753 scope.go:117] "RemoveContainer" containerID="cad549f567dc8d2e735609c8aa87f47f0cd16a596a682692a57f23801cc75689"
Jan 29 14:21:43 crc kubenswrapper[4753]: I0129 14:21:43.428219 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-hh94p"]
Jan 29 14:21:43 crc kubenswrapper[4753]: I0129 14:21:43.428642 4753 scope.go:117] "RemoveContainer" containerID="50ab971e9a239744fde27c5ac8cddddc70b2c7c9487a2ad97df2565c8899536b"
Jan 29 14:21:43 crc kubenswrapper[4753]: I0129 14:21:43.439683 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bb67c87c9-hh94p"]
Jan 29 14:21:44 crc kubenswrapper[4753]: I0129 14:21:44.022612 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-55965d95bf-pftcq"
Jan 29 14:21:44 crc kubenswrapper[4753]: I0129 14:21:44.125844 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56665b56dd-pw99j"]
Jan 29 14:21:44 crc kubenswrapper[4753]: I0129 14:21:44.126244 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56665b56dd-pw99j" podUID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerName="neutron-api" containerID="cri-o://af2fdc76bc0016cdd7bf585f0bccc1c95c9e4d9b502387f54870c1a7b741f38b" gracePeriod=30
Jan 29 14:21:44 crc kubenswrapper[4753]: I0129 14:21:44.126339 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56665b56dd-pw99j" podUID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerName="neutron-httpd" containerID="cri-o://338979fa3be2b8317f4043dc0e2703a440119f91dac85ae2da3ad29090e1a012" gracePeriod=30
Jan 29 14:21:44 crc kubenswrapper[4753]: I0129 14:21:44.166622 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b74239b-a7f3-4d1d-b292-50d131a0dad1" path="/var/lib/kubelet/pods/3b74239b-a7f3-4d1d-b292-50d131a0dad1/volumes"
Jan 29 14:21:44 crc kubenswrapper[4753]: I0129 14:21:44.416559 4753 generic.go:334] "Generic (PLEG): container finished" podID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerID="338979fa3be2b8317f4043dc0e2703a440119f91dac85ae2da3ad29090e1a012" exitCode=0
Jan 29 14:21:44 crc kubenswrapper[4753]: I0129 14:21:44.416603 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56665b56dd-pw99j" event={"ID":"4192bada-5336-48ed-9bb7-d6db5a1e6278","Type":"ContainerDied","Data":"338979fa3be2b8317f4043dc0e2703a440119f91dac85ae2da3ad29090e1a012"}
Jan 29 14:21:45 crc kubenswrapper[4753]: I0129 14:21:45.557360 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:45 crc kubenswrapper[4753]: I0129 14:21:45.559961 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:45 crc kubenswrapper[4753]: I0129 14:21:45.616100 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:45 crc kubenswrapper[4753]: I0129 14:21:45.637010 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.469811 4753 generic.go:334] "Generic (PLEG): container finished" podID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerID="a2184a17ca6189abab48b4f8c663c1b4774fe6572faf2feded15b0aa9738116d" exitCode=0
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.469882 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d6feeb92-3011-4e0e-b50d-9839bd14265f","Type":"ContainerDied","Data":"a2184a17ca6189abab48b4f8c663c1b4774fe6572faf2feded15b0aa9738116d"}
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.470736 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.470910 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.746659 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.905289 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvx9x\" (UniqueName: \"kubernetes.io/projected/d6feeb92-3011-4e0e-b50d-9839bd14265f-kube-api-access-tvx9x\") pod \"d6feeb92-3011-4e0e-b50d-9839bd14265f\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") "
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.905453 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6feeb92-3011-4e0e-b50d-9839bd14265f-etc-machine-id\") pod \"d6feeb92-3011-4e0e-b50d-9839bd14265f\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") "
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.905539 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data\") pod \"d6feeb92-3011-4e0e-b50d-9839bd14265f\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") "
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.905611 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-scripts\") pod \"d6feeb92-3011-4e0e-b50d-9839bd14265f\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") "
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.905673 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-combined-ca-bundle\") pod \"d6feeb92-3011-4e0e-b50d-9839bd14265f\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") "
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.905710 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data-custom\") pod \"d6feeb92-3011-4e0e-b50d-9839bd14265f\" (UID: \"d6feeb92-3011-4e0e-b50d-9839bd14265f\") "
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.907415 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6feeb92-3011-4e0e-b50d-9839bd14265f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d6feeb92-3011-4e0e-b50d-9839bd14265f" (UID: "d6feeb92-3011-4e0e-b50d-9839bd14265f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.936703 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-scripts" (OuterVolumeSpecName: "scripts") pod "d6feeb92-3011-4e0e-b50d-9839bd14265f" (UID: "d6feeb92-3011-4e0e-b50d-9839bd14265f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.947076 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6feeb92-3011-4e0e-b50d-9839bd14265f-kube-api-access-tvx9x" (OuterVolumeSpecName: "kube-api-access-tvx9x") pod "d6feeb92-3011-4e0e-b50d-9839bd14265f" (UID: "d6feeb92-3011-4e0e-b50d-9839bd14265f"). InnerVolumeSpecName "kube-api-access-tvx9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:21:46 crc kubenswrapper[4753]: I0129 14:21:46.962972 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d6feeb92-3011-4e0e-b50d-9839bd14265f" (UID: "d6feeb92-3011-4e0e-b50d-9839bd14265f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.012797 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvx9x\" (UniqueName: \"kubernetes.io/projected/d6feeb92-3011-4e0e-b50d-9839bd14265f-kube-api-access-tvx9x\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.012822 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6feeb92-3011-4e0e-b50d-9839bd14265f-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.012832 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.012841 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.045441 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6feeb92-3011-4e0e-b50d-9839bd14265f" (UID: "d6feeb92-3011-4e0e-b50d-9839bd14265f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.067893 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data" (OuterVolumeSpecName: "config-data") pod "d6feeb92-3011-4e0e-b50d-9839bd14265f" (UID: "d6feeb92-3011-4e0e-b50d-9839bd14265f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.115044 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.115086 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6feeb92-3011-4e0e-b50d-9839bd14265f-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.187765 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7896b659cf-r8vcb"]
Jan 29 14:21:47 crc kubenswrapper[4753]: E0129 14:21:47.188169 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerName="probe"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.188186 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerName="probe"
Jan 29 14:21:47 crc kubenswrapper[4753]: E0129 14:21:47.188211 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerName="cinder-scheduler"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.188218 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerName="cinder-scheduler"
Jan 29 14:21:47 crc kubenswrapper[4753]: E0129 14:21:47.188231 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b74239b-a7f3-4d1d-b292-50d131a0dad1" containerName="init"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.188237 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b74239b-a7f3-4d1d-b292-50d131a0dad1" containerName="init"
Jan 29 14:21:47 crc kubenswrapper[4753]: E0129 14:21:47.188251 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b74239b-a7f3-4d1d-b292-50d131a0dad1" containerName="dnsmasq-dns"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.188257 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b74239b-a7f3-4d1d-b292-50d131a0dad1" containerName="dnsmasq-dns"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.188425 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerName="probe"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.188444 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b74239b-a7f3-4d1d-b292-50d131a0dad1" containerName="dnsmasq-dns"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.188454 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6feeb92-3011-4e0e-b50d-9839bd14265f" containerName="cinder-scheduler"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.190035 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.199654 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.199871 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.200003 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.230464 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7896b659cf-r8vcb"]
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.320608 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-etc-swift\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.320685 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-combined-ca-bundle\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.320722 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-internal-tls-certs\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.320754 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-public-tls-certs\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.320784 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmrjc\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-kube-api-access-rmrjc\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.320803 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-log-httpd\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.320852 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-config-data\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.320872 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-run-httpd\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.422517 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-config-data\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.422582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-run-httpd\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.422620 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-etc-swift\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.422680 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-combined-ca-bundle\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.422713 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-internal-tls-certs\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.422765 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-public-tls-certs\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.422814 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmrjc\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-kube-api-access-rmrjc\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.422836 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-log-httpd\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.423215 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-run-httpd\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.423854 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-log-httpd\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.436737 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-public-tls-certs\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.438101 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-etc-swift\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.440604 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-config-data\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.443492 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-combined-ca-bundle\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.445046 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-internal-tls-certs\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.448209 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmrjc\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-kube-api-access-rmrjc\") pod \"swift-proxy-7896b659cf-r8vcb\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.533107 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.537706 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d6feeb92-3011-4e0e-b50d-9839bd14265f","Type":"ContainerDied","Data":"b5e804a8de55dde6ed52495c50675acd15cda0ac445835a4f35e10202857acd6"}
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.537800 4753 scope.go:117] "RemoveContainer" containerID="d16d9a2ee2633cbe1578fd0a07f3702029a9684207a8d67784588168d6b4cdcd"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.572749 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.579537 4753 scope.go:117] "RemoveContainer" containerID="a2184a17ca6189abab48b4f8c663c1b4774fe6572faf2feded15b0aa9738116d"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.696272 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.723029 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.765992 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.768860 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.793524 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.822519 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.852874 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.853044 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.853143 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86d005cc-e014-44b4-b8fa-f402d656ae5a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.853318 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mlw8\" (UniqueName: \"kubernetes.io/projected/86d005cc-e014-44b4-b8fa-f402d656ae5a-kube-api-access-7mlw8\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.853476 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-scripts\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.853561 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.958049 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86d005cc-e014-44b4-b8fa-f402d656ae5a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.958504 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mlw8\" (UniqueName: \"kubernetes.io/projected/86d005cc-e014-44b4-b8fa-f402d656ae5a-kube-api-access-7mlw8\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.958567 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-scripts\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.958595 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.958631 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.958684 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.964338 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86d005cc-e014-44b4-b8fa-f402d656ae5a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.968963 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.970715 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.976379 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:47 crc kubenswrapper[4753]: I0129 14:21:47.977757 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-scripts\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.005578 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mlw8\" (UniqueName: \"kubernetes.io/projected/86d005cc-e014-44b4-b8fa-f402d656ae5a-kube-api-access-7mlw8\") pod \"cinder-scheduler-0\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " pod="openstack/cinder-scheduler-0"
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.144308 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.164068 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6feeb92-3011-4e0e-b50d-9839bd14265f" path="/var/lib/kubelet/pods/d6feeb92-3011-4e0e-b50d-9839bd14265f/volumes"
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.403626 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7896b659cf-r8vcb"]
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.582701 4753 generic.go:334] "Generic (PLEG): container finished" podID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerID="af2fdc76bc0016cdd7bf585f0bccc1c95c9e4d9b502387f54870c1a7b741f38b" exitCode=0
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.583198 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56665b56dd-pw99j" event={"ID":"4192bada-5336-48ed-9bb7-d6db5a1e6278","Type":"ContainerDied","Data":"af2fdc76bc0016cdd7bf585f0bccc1c95c9e4d9b502387f54870c1a7b741f38b"}
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.615456 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7896b659cf-r8vcb" event={"ID":"82c06532-183f-4527-b423-630110596fb4","Type":"ContainerStarted","Data":"87d0d9f0e044c738cb947de5af9e24b5365d10b11b844923864aa48497f20936"}
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.870697 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.870972 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="ceilometer-central-agent" containerID="cri-o://8c8b56903eb2784b7ee4eb6de5c0202f25c9e47f856620b26ff13909885877da" gracePeriod=30
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.871109 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="proxy-httpd" containerID="cri-o://79566b6b49c7579aa3f624c9963d2eae687a842e01846f5ebf14cdb5b4ee1694" gracePeriod=30
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.871179 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="sg-core" containerID="cri-o://f35160890c1be25194e2028f513efc9f2a5c8394af2ddf82cb8ef6da1fdb2e90" gracePeriod=30
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.871213 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="ceilometer-notification-agent" containerID="cri-o://981d9dd663e275310464c075a209536c041ffd974fc888e0d90e3a0184663d11" gracePeriod=30
Jan 29 14:21:48 crc kubenswrapper[4753]: I0129 14:21:48.990901 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": read tcp 10.217.0.2:48190->10.217.0.162:3000: read: connection reset by peer"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.078320 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.123896 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.188764 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-httpd-config\") pod \"4192bada-5336-48ed-9bb7-d6db5a1e6278\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") "
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.188905 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-config\") pod \"4192bada-5336-48ed-9bb7-d6db5a1e6278\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") "
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.188954 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-ovndb-tls-certs\") pod \"4192bada-5336-48ed-9bb7-d6db5a1e6278\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") "
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.188972 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-combined-ca-bundle\") pod \"4192bada-5336-48ed-9bb7-d6db5a1e6278\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") "
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.189062 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kl7q\" (UniqueName: \"kubernetes.io/projected/4192bada-5336-48ed-9bb7-d6db5a1e6278-kube-api-access-7kl7q\") pod \"4192bada-5336-48ed-9bb7-d6db5a1e6278\" (UID: \"4192bada-5336-48ed-9bb7-d6db5a1e6278\") "
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.210409 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4192bada-5336-48ed-9bb7-d6db5a1e6278-kube-api-access-7kl7q" (OuterVolumeSpecName: "kube-api-access-7kl7q") pod "4192bada-5336-48ed-9bb7-d6db5a1e6278" (UID: "4192bada-5336-48ed-9bb7-d6db5a1e6278"). InnerVolumeSpecName "kube-api-access-7kl7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.225363 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4192bada-5336-48ed-9bb7-d6db5a1e6278" (UID: "4192bada-5336-48ed-9bb7-d6db5a1e6278"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.242583 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4192bada-5336-48ed-9bb7-d6db5a1e6278" (UID: "4192bada-5336-48ed-9bb7-d6db5a1e6278"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.254532 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-config" (OuterVolumeSpecName: "config") pod "4192bada-5336-48ed-9bb7-d6db5a1e6278" (UID: "4192bada-5336-48ed-9bb7-d6db5a1e6278"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.295316 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4192bada-5336-48ed-9bb7-d6db5a1e6278" (UID: "4192bada-5336-48ed-9bb7-d6db5a1e6278"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.296515 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-config\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.296535 4753 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.296548 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.296558 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kl7q\" (UniqueName: \"kubernetes.io/projected/4192bada-5336-48ed-9bb7-d6db5a1e6278-kube-api-access-7kl7q\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.296567 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4192bada-5336-48ed-9bb7-d6db5a1e6278-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.361653 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.361785 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.629752 4753 generic.go:334] "Generic (PLEG): container finished" podID="af576162-4633-43b7-908f-a6dc16560cc7" containerID="79566b6b49c7579aa3f624c9963d2eae687a842e01846f5ebf14cdb5b4ee1694" exitCode=0
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.630060 4753 generic.go:334] "Generic (PLEG): container finished" podID="af576162-4633-43b7-908f-a6dc16560cc7" containerID="f35160890c1be25194e2028f513efc9f2a5c8394af2ddf82cb8ef6da1fdb2e90" exitCode=2
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.630070 4753 generic.go:334] "Generic (PLEG): container finished" podID="af576162-4633-43b7-908f-a6dc16560cc7" containerID="8c8b56903eb2784b7ee4eb6de5c0202f25c9e47f856620b26ff13909885877da" exitCode=0
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.630106 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerDied","Data":"79566b6b49c7579aa3f624c9963d2eae687a842e01846f5ebf14cdb5b4ee1694"}
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.630131 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerDied","Data":"f35160890c1be25194e2028f513efc9f2a5c8394af2ddf82cb8ef6da1fdb2e90"}
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.630142 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerDied","Data":"8c8b56903eb2784b7ee4eb6de5c0202f25c9e47f856620b26ff13909885877da"}
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.631563 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56665b56dd-pw99j" event={"ID":"4192bada-5336-48ed-9bb7-d6db5a1e6278","Type":"ContainerDied","Data":"d88912bc2517ad063b12a12057e4c0f0d9a9d8cf92350d8975097dd6d9a079c8"}
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.631591 4753 scope.go:117] "RemoveContainer" containerID="338979fa3be2b8317f4043dc0e2703a440119f91dac85ae2da3ad29090e1a012"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.631710 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56665b56dd-pw99j"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.645500 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7896b659cf-r8vcb" event={"ID":"82c06532-183f-4527-b423-630110596fb4","Type":"ContainerStarted","Data":"c03517e0292b57c6b359b996af8065e26ca76f92b447521f128204efb05538e3"}
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.645564 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7896b659cf-r8vcb" event={"ID":"82c06532-183f-4527-b423-630110596fb4","Type":"ContainerStarted","Data":"94767674e99c2eb5b0abc2208da66b5b5d0e5bbb98eae7fc0cc7902d80c6ed4b"}
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.645848 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.645890 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.647257 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86d005cc-e014-44b4-b8fa-f402d656ae5a","Type":"ContainerStarted","Data":"610ee9ef9361d8e9f8a1747de3404adb6c4ac64de4f821f935f682958decbab7"}
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.663122 4753 scope.go:117] "RemoveContainer" containerID="af2fdc76bc0016cdd7bf585f0bccc1c95c9e4d9b502387f54870c1a7b741f38b"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.672238 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56665b56dd-pw99j"]
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.693217 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56665b56dd-pw99j"]
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.693574 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7896b659cf-r8vcb" podStartSLOduration=2.693549226 podStartE2EDuration="2.693549226s" podCreationTimestamp="2026-01-29 14:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:49.684647656 +0000 UTC m=+1144.379382038" watchObservedRunningTime="2026-01-29 14:21:49.693549226 +0000 UTC m=+1144.388283608"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.902573 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Jan 29 14:21:49 crc kubenswrapper[4753]: I0129 14:21:49.934263 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 29 14:21:50 crc kubenswrapper[4753]: I0129 14:21:50.163115 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4192bada-5336-48ed-9bb7-d6db5a1e6278" path="/var/lib/kubelet/pods/4192bada-5336-48ed-9bb7-d6db5a1e6278/volumes"
Jan 29 14:21:51 crc kubenswrapper[4753]: I0129 14:21:51.001315 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86d005cc-e014-44b4-b8fa-f402d656ae5a","Type":"ContainerStarted","Data":"5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef"}
Jan 29 14:21:52 crc kubenswrapper[4753]: I0129 14:21:52.018568 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86d005cc-e014-44b4-b8fa-f402d656ae5a","Type":"ContainerStarted","Data":"b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046"}
Jan 29 14:21:52 crc kubenswrapper[4753]: I0129 14:21:52.047509 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.047482851 podStartE2EDuration="5.047482851s" podCreationTimestamp="2026-01-29 14:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:21:52.04222079 +0000 UTC m=+1146.736955182" watchObservedRunningTime="2026-01-29 14:21:52.047482851 +0000 UTC m=+1146.742217233"
Jan 29 14:21:53 crc kubenswrapper[4753]: I0129 14:21:53.033641 4753 generic.go:334] "Generic (PLEG): container finished" podID="af576162-4633-43b7-908f-a6dc16560cc7" containerID="981d9dd663e275310464c075a209536c041ffd974fc888e0d90e3a0184663d11" exitCode=0
Jan 29 14:21:53 crc kubenswrapper[4753]: I0129 14:21:53.033743 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerDied","Data":"981d9dd663e275310464c075a209536c041ffd974fc888e0d90e3a0184663d11"}
Jan 29 14:21:53 crc kubenswrapper[4753]: I0129 14:21:53.146919 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Jan 29 14:21:56 crc kubenswrapper[4753]: I0129 14:21:56.929957 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 14:21:56 crc kubenswrapper[4753]: I0129 14:21:56.930646 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="48f31315-8403-4d07-927f-4fe66c4db88b" containerName="glance-log" containerID="cri-o://f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224" gracePeriod=30
Jan 29 14:21:56 crc kubenswrapper[4753]: I0129 14:21:56.930760 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="48f31315-8403-4d07-927f-4fe66c4db88b" containerName="glance-httpd" containerID="cri-o://372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c" gracePeriod=30
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.054412 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.054472 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.054515 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.054990 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12d7924f9ff7f63db0598221481b584d9481ba358c87450c2b5683ad81272c03"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.055043 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://12d7924f9ff7f63db0598221481b584d9481ba358c87450c2b5683ad81272c03" gracePeriod=600
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.199313 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": dial tcp 10.217.0.162:3000: connect: connection refused"
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.582731 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.583864 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7896b659cf-r8vcb"
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.902470 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.902768 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerName="glance-log" containerID="cri-o://82d06422f106535b2475859bf3a3c11ef7c13e57119445daed3d619f602e5be6" gracePeriod=30
Jan 29 14:21:57 crc kubenswrapper[4753]: I0129 14:21:57.903342 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerName="glance-httpd" containerID="cri-o://324879b4bc43390a72b922e6f1989965ca8ade4892900bc8469ddff0dccfb0ef" gracePeriod=30
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.087112 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-p4tmx"]
Jan 29 14:21:58 crc kubenswrapper[4753]: E0129 14:21:58.088110 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerName="neutron-api"
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.088128 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerName="neutron-api"
Jan 29 14:21:58 crc kubenswrapper[4753]: E0129 14:21:58.088165 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerName="neutron-httpd"
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.112060 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerName="neutron-httpd"
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.092130 4753 generic.go:334] "Generic (PLEG): container finished" podID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerID="82d06422f106535b2475859bf3a3c11ef7c13e57119445daed3d619f602e5be6" exitCode=143
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.113524 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerName="neutron-httpd"
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.113556 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4192bada-5336-48ed-9bb7-d6db5a1e6278" containerName="neutron-api"
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.114522 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"107a450b-f03d-4fbb-ad45-e04584ed3972","Type":"ContainerDied","Data":"82d06422f106535b2475859bf3a3c11ef7c13e57119445daed3d619f602e5be6"}
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.114582 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p4tmx"]
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.114680 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p4tmx"
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.124768 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="12d7924f9ff7f63db0598221481b584d9481ba358c87450c2b5683ad81272c03" exitCode=0
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.124923 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"12d7924f9ff7f63db0598221481b584d9481ba358c87450c2b5683ad81272c03"}
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.124948 4753 scope.go:117] "RemoveContainer" containerID="60212ebb28237ec94902995089383e664d1c6ec845691e27febd40b2f34c00cd"
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.143665 4753 generic.go:334] "Generic (PLEG): container finished" podID="48f31315-8403-4d07-927f-4fe66c4db88b" containerID="f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224" exitCode=143
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.144915 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48f31315-8403-4d07-927f-4fe66c4db88b","Type":"ContainerDied","Data":"f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224"}
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.193834 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df97e34-891e-41f6-bf19-24d14e83c610-operator-scripts\") pod \"nova-api-db-create-p4tmx\" (UID: \"7df97e34-891e-41f6-bf19-24d14e83c610\") " pod="openstack/nova-api-db-create-p4tmx"
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.194001 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4659g\" (UniqueName: \"kubernetes.io/projected/7df97e34-891e-41f6-bf19-24d14e83c610-kube-api-access-4659g\") pod \"nova-api-db-create-p4tmx\" (UID: \"7df97e34-891e-41f6-bf19-24d14e83c610\") " pod="openstack/nova-api-db-create-p4tmx"
Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.194176 4753 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack/nova-cell0-db-create-xrf5w"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.195429 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.219759 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xrf5w"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.296145 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvcnz\" (UniqueName: \"kubernetes.io/projected/fdfa810e-df55-4e25-8280-b6275c932336-kube-api-access-kvcnz\") pod \"nova-cell0-db-create-xrf5w\" (UID: \"fdfa810e-df55-4e25-8280-b6275c932336\") " pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.296247 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4659g\" (UniqueName: \"kubernetes.io/projected/7df97e34-891e-41f6-bf19-24d14e83c610-kube-api-access-4659g\") pod \"nova-api-db-create-p4tmx\" (UID: \"7df97e34-891e-41f6-bf19-24d14e83c610\") " pod="openstack/nova-api-db-create-p4tmx" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.296320 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df97e34-891e-41f6-bf19-24d14e83c610-operator-scripts\") pod \"nova-api-db-create-p4tmx\" (UID: \"7df97e34-891e-41f6-bf19-24d14e83c610\") " pod="openstack/nova-api-db-create-p4tmx" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.296356 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdfa810e-df55-4e25-8280-b6275c932336-operator-scripts\") pod \"nova-cell0-db-create-xrf5w\" (UID: \"fdfa810e-df55-4e25-8280-b6275c932336\") " pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.297200 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-870b-account-create-update-2rg5l"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.297264 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df97e34-891e-41f6-bf19-24d14e83c610-operator-scripts\") pod \"nova-api-db-create-p4tmx\" (UID: \"7df97e34-891e-41f6-bf19-24d14e83c610\") " pod="openstack/nova-api-db-create-p4tmx" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.298412 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.301629 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.313210 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-n6b96"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.314423 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.326225 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4659g\" (UniqueName: \"kubernetes.io/projected/7df97e34-891e-41f6-bf19-24d14e83c610-kube-api-access-4659g\") pod \"nova-api-db-create-p4tmx\" (UID: \"7df97e34-891e-41f6-bf19-24d14e83c610\") " pod="openstack/nova-api-db-create-p4tmx" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.339574 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-870b-account-create-update-2rg5l"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.368036 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.369577 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-n6b96"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.397672 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j5tb\" (UniqueName: \"kubernetes.io/projected/8d2de408-6b00-4029-a1a9-d0ab19401f96-kube-api-access-6j5tb\") pod \"nova-api-870b-account-create-update-2rg5l\" (UID: \"8d2de408-6b00-4029-a1a9-d0ab19401f96\") " pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.397735 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dade6dd-52d6-419f-8460-438e78771a6d-operator-scripts\") pod \"nova-cell1-db-create-n6b96\" (UID: \"4dade6dd-52d6-419f-8460-438e78771a6d\") " pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.397809 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tftg\" (UniqueName: \"kubernetes.io/projected/4dade6dd-52d6-419f-8460-438e78771a6d-kube-api-access-8tftg\") pod \"nova-cell1-db-create-n6b96\" (UID: \"4dade6dd-52d6-419f-8460-438e78771a6d\") " pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.397843 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdfa810e-df55-4e25-8280-b6275c932336-operator-scripts\") pod \"nova-cell0-db-create-xrf5w\" (UID: \"fdfa810e-df55-4e25-8280-b6275c932336\") " pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.397912 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d2de408-6b00-4029-a1a9-d0ab19401f96-operator-scripts\") pod \"nova-api-870b-account-create-update-2rg5l\" (UID: \"8d2de408-6b00-4029-a1a9-d0ab19401f96\") " pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.397940 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvcnz\" (UniqueName: \"kubernetes.io/projected/fdfa810e-df55-4e25-8280-b6275c932336-kube-api-access-kvcnz\") pod \"nova-cell0-db-create-xrf5w\" (UID: \"fdfa810e-df55-4e25-8280-b6275c932336\") " pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.399368 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdfa810e-df55-4e25-8280-b6275c932336-operator-scripts\") pod \"nova-cell0-db-create-xrf5w\" (UID: \"fdfa810e-df55-4e25-8280-b6275c932336\") " pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.420349 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvcnz\" (UniqueName: \"kubernetes.io/projected/fdfa810e-df55-4e25-8280-b6275c932336-kube-api-access-kvcnz\") pod \"nova-cell0-db-create-xrf5w\" (UID: \"fdfa810e-df55-4e25-8280-b6275c932336\") " pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.449620 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p4tmx" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.495724 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-vlh99"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.497489 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.499754 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tftg\" (UniqueName: \"kubernetes.io/projected/4dade6dd-52d6-419f-8460-438e78771a6d-kube-api-access-8tftg\") pod \"nova-cell1-db-create-n6b96\" (UID: \"4dade6dd-52d6-419f-8460-438e78771a6d\") " pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.499835 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d2de408-6b00-4029-a1a9-d0ab19401f96-operator-scripts\") pod \"nova-api-870b-account-create-update-2rg5l\" (UID: \"8d2de408-6b00-4029-a1a9-d0ab19401f96\") " pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.499909 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j5tb\" (UniqueName: \"kubernetes.io/projected/8d2de408-6b00-4029-a1a9-d0ab19401f96-kube-api-access-6j5tb\") pod \"nova-api-870b-account-create-update-2rg5l\" (UID: \"8d2de408-6b00-4029-a1a9-d0ab19401f96\") " pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.499935 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dade6dd-52d6-419f-8460-438e78771a6d-operator-scripts\") pod \"nova-cell1-db-create-n6b96\" (UID: \"4dade6dd-52d6-419f-8460-438e78771a6d\") " pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.500408 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.501008 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dade6dd-52d6-419f-8460-438e78771a6d-operator-scripts\") pod \"nova-cell1-db-create-n6b96\" (UID: \"4dade6dd-52d6-419f-8460-438e78771a6d\") " pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.501104 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d2de408-6b00-4029-a1a9-d0ab19401f96-operator-scripts\") pod \"nova-api-870b-account-create-update-2rg5l\" (UID: \"8d2de408-6b00-4029-a1a9-d0ab19401f96\") " pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.515157 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-vlh99"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.520494 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tftg\" (UniqueName: \"kubernetes.io/projected/4dade6dd-52d6-419f-8460-438e78771a6d-kube-api-access-8tftg\") pod \"nova-cell1-db-create-n6b96\" (UID: \"4dade6dd-52d6-419f-8460-438e78771a6d\") " pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.521412 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.523192 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j5tb\" (UniqueName: \"kubernetes.io/projected/8d2de408-6b00-4029-a1a9-d0ab19401f96-kube-api-access-6j5tb\") pod \"nova-api-870b-account-create-update-2rg5l\" (UID: \"8d2de408-6b00-4029-a1a9-d0ab19401f96\") " pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.602099 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9730846e-4e43-45f7-8b5d-38d4753b3b80-operator-scripts\") pod \"nova-cell0-fad8-account-create-update-vlh99\" (UID: \"9730846e-4e43-45f7-8b5d-38d4753b3b80\") " pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.602272 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzqtv\" (UniqueName: \"kubernetes.io/projected/9730846e-4e43-45f7-8b5d-38d4753b3b80-kube-api-access-hzqtv\") pod \"nova-cell0-fad8-account-create-update-vlh99\" (UID: \"9730846e-4e43-45f7-8b5d-38d4753b3b80\") " pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.620118 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.675373 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.676942 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-h5z4v"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.678193 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.681653 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.691521 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-h5z4v"] Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.704755 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzqtv\" (UniqueName: \"kubernetes.io/projected/9730846e-4e43-45f7-8b5d-38d4753b3b80-kube-api-access-hzqtv\") pod \"nova-cell0-fad8-account-create-update-vlh99\" (UID: \"9730846e-4e43-45f7-8b5d-38d4753b3b80\") " pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.704821 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9730846e-4e43-45f7-8b5d-38d4753b3b80-operator-scripts\") pod \"nova-cell0-fad8-account-create-update-vlh99\" (UID: \"9730846e-4e43-45f7-8b5d-38d4753b3b80\") " pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.704892 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbjx9\" (UniqueName: \"kubernetes.io/projected/845fdb72-7823-4160-941a-e70288b5f77b-kube-api-access-bbjx9\") pod \"nova-cell1-ced3-account-create-update-h5z4v\" (UID: \"845fdb72-7823-4160-941a-e70288b5f77b\") " pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.704928 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/845fdb72-7823-4160-941a-e70288b5f77b-operator-scripts\") pod \"nova-cell1-ced3-account-create-update-h5z4v\" (UID: \"845fdb72-7823-4160-941a-e70288b5f77b\") " pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.705880 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9730846e-4e43-45f7-8b5d-38d4753b3b80-operator-scripts\") pod \"nova-cell0-fad8-account-create-update-vlh99\" (UID: \"9730846e-4e43-45f7-8b5d-38d4753b3b80\") " pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.721046 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzqtv\" (UniqueName: \"kubernetes.io/projected/9730846e-4e43-45f7-8b5d-38d4753b3b80-kube-api-access-hzqtv\") pod \"nova-cell0-fad8-account-create-update-vlh99\" (UID: \"9730846e-4e43-45f7-8b5d-38d4753b3b80\") " pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.806004 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbjx9\" (UniqueName: \"kubernetes.io/projected/845fdb72-7823-4160-941a-e70288b5f77b-kube-api-access-bbjx9\") pod \"nova-cell1-ced3-account-create-update-h5z4v\" (UID: \"845fdb72-7823-4160-941a-e70288b5f77b\") " pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.806063 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/845fdb72-7823-4160-941a-e70288b5f77b-operator-scripts\") pod \"nova-cell1-ced3-account-create-update-h5z4v\" (UID: \"845fdb72-7823-4160-941a-e70288b5f77b\") " pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.807095 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/845fdb72-7823-4160-941a-e70288b5f77b-operator-scripts\") pod \"nova-cell1-ced3-account-create-update-h5z4v\" (UID: \"845fdb72-7823-4160-941a-e70288b5f77b\") " pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.821640 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.821965 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbjx9\" (UniqueName: \"kubernetes.io/projected/845fdb72-7823-4160-941a-e70288b5f77b-kube-api-access-bbjx9\") pod \"nova-cell1-ced3-account-create-update-h5z4v\" (UID: \"845fdb72-7823-4160-941a-e70288b5f77b\") " pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:21:58 crc kubenswrapper[4753]: I0129 14:21:58.999317 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.327525 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.519916 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-combined-ca-bundle\") pod \"af576162-4633-43b7-908f-a6dc16560cc7\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.520054 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-log-httpd\") pod \"af576162-4633-43b7-908f-a6dc16560cc7\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.520136 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-scripts\") pod \"af576162-4633-43b7-908f-a6dc16560cc7\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.520207 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v9kc\" (UniqueName: \"kubernetes.io/projected/af576162-4633-43b7-908f-a6dc16560cc7-kube-api-access-2v9kc\") pod \"af576162-4633-43b7-908f-a6dc16560cc7\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.520261 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-sg-core-conf-yaml\") pod \"af576162-4633-43b7-908f-a6dc16560cc7\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 
14:21:59.520316 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-run-httpd\") pod \"af576162-4633-43b7-908f-a6dc16560cc7\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.520403 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-config-data\") pod \"af576162-4633-43b7-908f-a6dc16560cc7\" (UID: \"af576162-4633-43b7-908f-a6dc16560cc7\") " Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.545460 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af576162-4633-43b7-908f-a6dc16560cc7" (UID: "af576162-4633-43b7-908f-a6dc16560cc7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.547741 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af576162-4633-43b7-908f-a6dc16560cc7" (UID: "af576162-4633-43b7-908f-a6dc16560cc7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.552438 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af576162-4633-43b7-908f-a6dc16560cc7-kube-api-access-2v9kc" (OuterVolumeSpecName: "kube-api-access-2v9kc") pod "af576162-4633-43b7-908f-a6dc16560cc7" (UID: "af576162-4633-43b7-908f-a6dc16560cc7"). InnerVolumeSpecName "kube-api-access-2v9kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.554053 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-scripts" (OuterVolumeSpecName: "scripts") pod "af576162-4633-43b7-908f-a6dc16560cc7" (UID: "af576162-4633-43b7-908f-a6dc16560cc7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.629437 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.629466 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af576162-4633-43b7-908f-a6dc16560cc7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.629474 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.629485 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v9kc\" (UniqueName: \"kubernetes.io/projected/af576162-4633-43b7-908f-a6dc16560cc7-kube-api-access-2v9kc\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.695268 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "af576162-4633-43b7-908f-a6dc16560cc7" (UID: "af576162-4633-43b7-908f-a6dc16560cc7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.731745 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.849361 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af576162-4633-43b7-908f-a6dc16560cc7" (UID: "af576162-4633-43b7-908f-a6dc16560cc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.860237 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-config-data" (OuterVolumeSpecName: "config-data") pod "af576162-4633-43b7-908f-a6dc16560cc7" (UID: "af576162-4633-43b7-908f-a6dc16560cc7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.935103 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:21:59 crc kubenswrapper[4753]: I0129 14:21:59.935137 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af576162-4633-43b7-908f-a6dc16560cc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.126954 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xrf5w"] Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.224819 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"606db0f5-bff8-4d65-bbad-63b8b1ba362c","Type":"ContainerStarted","Data":"b9f6872bb6478c7a1880d301b55f528a8869bd68eb2346a5287f912f5e9ed844"} Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.224873 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-n6b96"] Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.224892 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-870b-account-create-update-2rg5l"] Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.229538 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xrf5w" event={"ID":"fdfa810e-df55-4e25-8280-b6275c932336","Type":"ContainerStarted","Data":"03499132c504230176b7e4e663830c97f5f62fa51de4cf6eab3350c27fa680d8"} Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.258253 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"ad5415f8f2b12f61c8e4717f3b18699a52b4bcca8d38b639a61f5684d21e9c46"} Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.260705 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-vlh99"] Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.268293 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-h5z4v"] Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.272124 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.088522818 podStartE2EDuration="18.272101435s" podCreationTimestamp="2026-01-29 14:21:42 +0000 UTC" firstStartedPulling="2026-01-29 14:21:42.939012381 +0000 UTC m=+1137.633746763" lastFinishedPulling="2026-01-29 14:21:59.122591008 +0000 UTC m=+1153.817325380" observedRunningTime="2026-01-29 14:22:00.253564445 +0000 UTC m=+1154.948298827" watchObservedRunningTime="2026-01-29 14:22:00.272101435 +0000 UTC m=+1154.966835817" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.310292 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af576162-4633-43b7-908f-a6dc16560cc7","Type":"ContainerDied","Data":"25764975640a5623a4e5d595bed919920d711fdca6bea86f11d08fd17abb721c"} Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.310364 4753 scope.go:117] "RemoveContainer" containerID="79566b6b49c7579aa3f624c9963d2eae687a842e01846f5ebf14cdb5b4ee1694" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.310536 4753 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: W0129 14:22:00.312009 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9730846e_4e43_45f7_8b5d_38d4753b3b80.slice/crio-948ec19997a6f1a99a01526047cdebe2561fa4aec5639358ce1f049a7c326cba WatchSource:0}: Error finding container 948ec19997a6f1a99a01526047cdebe2561fa4aec5639358ce1f049a7c326cba: Status 404 returned error can't find the container with id 948ec19997a6f1a99a01526047cdebe2561fa4aec5639358ce1f049a7c326cba Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.315085 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n6b96" event={"ID":"4dade6dd-52d6-419f-8460-438e78771a6d","Type":"ContainerStarted","Data":"da52614ae156b30c0346a064624ada510bcd609181c39312fd431ce5f8acf385"} Jan 29 14:22:00 crc kubenswrapper[4753]: W0129 14:22:00.336599 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod845fdb72_7823_4160_941a_e70288b5f77b.slice/crio-00fa8dce14261e9438d8300c6739511e46d28271bad9287d35787a1999da21fb WatchSource:0}: Error finding container 00fa8dce14261e9438d8300c6739511e46d28271bad9287d35787a1999da21fb: Status 404 returned error can't find the container with id 00fa8dce14261e9438d8300c6739511e46d28271bad9287d35787a1999da21fb Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.339768 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.347977 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.369620 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:00 crc kubenswrapper[4753]: E0129 14:22:00.370104 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="ceilometer-notification-agent" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.370124 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="ceilometer-notification-agent" Jan 29 14:22:00 crc kubenswrapper[4753]: E0129 14:22:00.370144 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="sg-core" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.370245 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="sg-core" Jan 29 14:22:00 crc kubenswrapper[4753]: E0129 14:22:00.370261 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="ceilometer-central-agent" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.370268 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="ceilometer-central-agent" Jan 29 14:22:00 crc kubenswrapper[4753]: E0129 14:22:00.370284 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="proxy-httpd" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.370290 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="proxy-httpd" 
Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.371245 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="ceilometer-central-agent" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.371266 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="ceilometer-notification-agent" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.371275 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="sg-core" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.371283 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="af576162-4633-43b7-908f-a6dc16560cc7" containerName="proxy-httpd" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.373477 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.383595 4753 scope.go:117] "RemoveContainer" containerID="f35160890c1be25194e2028f513efc9f2a5c8394af2ddf82cb8ef6da1fdb2e90" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.383942 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.384169 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.393471 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.400123 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p4tmx"] Jan 29 14:22:00 crc kubenswrapper[4753]: W0129 14:22:00.435419 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df97e34_891e_41f6_bf19_24d14e83c610.slice/crio-d65c5bbdd306cedae8c7d78117160f10700fe7d5433c05ca9b6df3ffe2762b33 WatchSource:0}: Error finding container d65c5bbdd306cedae8c7d78117160f10700fe7d5433c05ca9b6df3ffe2762b33: Status 404 returned error can't find the container with id d65c5bbdd306cedae8c7d78117160f10700fe7d5433c05ca9b6df3ffe2762b33 Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.465611 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.465681 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-log-httpd\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.465841 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvtbg\" (UniqueName: \"kubernetes.io/projected/b6ecb4be-e2ff-4138-9399-723f1b586a71-kube-api-access-lvtbg\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.465948 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-run-httpd\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.465997 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-scripts\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.466034 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.466055 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-config-data\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.567330 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.567565 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-log-httpd\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.567631 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvtbg\" (UniqueName: \"kubernetes.io/projected/b6ecb4be-e2ff-4138-9399-723f1b586a71-kube-api-access-lvtbg\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.567692 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-run-httpd\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.567718 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-scripts\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.567751 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc 
kubenswrapper[4753]: I0129 14:22:00.567768 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-config-data\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.568769 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-run-httpd\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.568810 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-log-httpd\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.574485 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-scripts\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.574812 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.574823 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-config-data\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.578348 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.599240 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvtbg\" (UniqueName: \"kubernetes.io/projected/b6ecb4be-e2ff-4138-9399-723f1b586a71-kube-api-access-lvtbg\") pod \"ceilometer-0\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " pod="openstack/ceilometer-0" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.605733 4753 scope.go:117] "RemoveContainer" containerID="981d9dd663e275310464c075a209536c041ffd974fc888e0d90e3a0184663d11" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.834941 4753 scope.go:117] "RemoveContainer" containerID="8c8b56903eb2784b7ee4eb6de5c0202f25c9e47f856620b26ff13909885877da" Jan 29 14:22:00 crc kubenswrapper[4753]: I0129 14:22:00.865538 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.005098 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.077287 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-httpd-run\") pod \"48f31315-8403-4d07-927f-4fe66c4db88b\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.077860 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-scripts\") pod \"48f31315-8403-4d07-927f-4fe66c4db88b\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.077953 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-combined-ca-bundle\") pod \"48f31315-8403-4d07-927f-4fe66c4db88b\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.077985 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"48f31315-8403-4d07-927f-4fe66c4db88b\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.078102 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-public-tls-certs\") pod \"48f31315-8403-4d07-927f-4fe66c4db88b\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.078284 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-logs\") pod \"48f31315-8403-4d07-927f-4fe66c4db88b\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.078357 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jldnx\" (UniqueName: \"kubernetes.io/projected/48f31315-8403-4d07-927f-4fe66c4db88b-kube-api-access-jldnx\") pod \"48f31315-8403-4d07-927f-4fe66c4db88b\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.078737 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-config-data\") pod \"48f31315-8403-4d07-927f-4fe66c4db88b\" (UID: \"48f31315-8403-4d07-927f-4fe66c4db88b\") " Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.079571 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "48f31315-8403-4d07-927f-4fe66c4db88b" (UID: "48f31315-8403-4d07-927f-4fe66c4db88b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.082125 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.082251 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-logs" (OuterVolumeSpecName: "logs") pod "48f31315-8403-4d07-927f-4fe66c4db88b" (UID: "48f31315-8403-4d07-927f-4fe66c4db88b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.115428 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-scripts" (OuterVolumeSpecName: "scripts") pod "48f31315-8403-4d07-927f-4fe66c4db88b" (UID: "48f31315-8403-4d07-927f-4fe66c4db88b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.117743 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f31315-8403-4d07-927f-4fe66c4db88b-kube-api-access-jldnx" (OuterVolumeSpecName: "kube-api-access-jldnx") pod "48f31315-8403-4d07-927f-4fe66c4db88b" (UID: "48f31315-8403-4d07-927f-4fe66c4db88b"). InnerVolumeSpecName "kube-api-access-jldnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.127698 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "48f31315-8403-4d07-927f-4fe66c4db88b" (UID: "48f31315-8403-4d07-927f-4fe66c4db88b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.179417 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-config-data" (OuterVolumeSpecName: "config-data") pod "48f31315-8403-4d07-927f-4fe66c4db88b" (UID: "48f31315-8403-4d07-927f-4fe66c4db88b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.186234 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f31315-8403-4d07-927f-4fe66c4db88b-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.186261 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jldnx\" (UniqueName: \"kubernetes.io/projected/48f31315-8403-4d07-927f-4fe66c4db88b-kube-api-access-jldnx\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.186269 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.186278 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.186299 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.186990 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48f31315-8403-4d07-927f-4fe66c4db88b" (UID: "48f31315-8403-4d07-927f-4fe66c4db88b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.193086 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "48f31315-8403-4d07-927f-4fe66c4db88b" (UID: "48f31315-8403-4d07-927f-4fe66c4db88b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.208124 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.287449 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.287487 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.287496 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f31315-8403-4d07-927f-4fe66c4db88b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.340316 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" event={"ID":"845fdb72-7823-4160-941a-e70288b5f77b","Type":"ContainerStarted","Data":"09fccc35a93bd7153a392185cb41cd50ef322765121cbd848e3a3e60eb0e62c6"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.340374 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" event={"ID":"845fdb72-7823-4160-941a-e70288b5f77b","Type":"ContainerStarted","Data":"00fa8dce14261e9438d8300c6739511e46d28271bad9287d35787a1999da21fb"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.355419 4753 generic.go:334] "Generic (PLEG): container finished" podID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerID="324879b4bc43390a72b922e6f1989965ca8ade4892900bc8469ddff0dccfb0ef" exitCode=0 Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.355525 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"107a450b-f03d-4fbb-ad45-e04584ed3972","Type":"ContainerDied","Data":"324879b4bc43390a72b922e6f1989965ca8ade4892900bc8469ddff0dccfb0ef"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.359884 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" podStartSLOduration=3.359863236 podStartE2EDuration="3.359863236s" podCreationTimestamp="2026-01-29 14:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:01.356780123 +0000 UTC m=+1156.051514505" watchObservedRunningTime="2026-01-29 14:22:01.359863236 +0000 UTC m=+1156.054597618" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.360590 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fad8-account-create-update-vlh99" event={"ID":"9730846e-4e43-45f7-8b5d-38d4753b3b80","Type":"ContainerStarted","Data":"b77d8c3d96dc50269ebb9916110f03d8650bc3d367afa27d97606e1aa2292caa"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.360634 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fad8-account-create-update-vlh99" event={"ID":"9730846e-4e43-45f7-8b5d-38d4753b3b80","Type":"ContainerStarted","Data":"948ec19997a6f1a99a01526047cdebe2561fa4aec5639358ce1f049a7c326cba"} Jan 29 
14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.361954 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n6b96" event={"ID":"4dade6dd-52d6-419f-8460-438e78771a6d","Type":"ContainerStarted","Data":"90e322952691e5340d7dfdb0eef5d70192f25afb3fa06e2774112e840dc0d238"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.379553 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-fad8-account-create-update-vlh99" podStartSLOduration=3.379531867 podStartE2EDuration="3.379531867s" podCreationTimestamp="2026-01-29 14:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:01.374663416 +0000 UTC m=+1156.069397798" watchObservedRunningTime="2026-01-29 14:22:01.379531867 +0000 UTC m=+1156.074266239" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.380808 4753 generic.go:334] "Generic (PLEG): container finished" podID="48f31315-8403-4d07-927f-4fe66c4db88b" containerID="372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c" exitCode=0 Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.380886 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.380918 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48f31315-8403-4d07-927f-4fe66c4db88b","Type":"ContainerDied","Data":"372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.380951 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48f31315-8403-4d07-927f-4fe66c4db88b","Type":"ContainerDied","Data":"a12b94b86b73ac008156e4245fb3aad2466169d5aa75a4d5e17208ff728e41c9"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.380967 4753 scope.go:117] "RemoveContainer" containerID="372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.439344 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p4tmx" event={"ID":"7df97e34-891e-41f6-bf19-24d14e83c610","Type":"ContainerStarted","Data":"4e6451e61042e484600018939d4da18de458042e275e4e20bffaa3d9d6779d31"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.439786 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p4tmx" event={"ID":"7df97e34-891e-41f6-bf19-24d14e83c610","Type":"ContainerStarted","Data":"d65c5bbdd306cedae8c7d78117160f10700fe7d5433c05ca9b6df3ffe2762b33"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.443763 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-n6b96" podStartSLOduration=3.44374226 podStartE2EDuration="3.44374226s" podCreationTimestamp="2026-01-29 14:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:01.396036032 +0000 UTC m=+1156.090770414" watchObservedRunningTime="2026-01-29 14:22:01.44374226 +0000 UTC m=+1156.138476642" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.444959 4753 generic.go:334] "Generic (PLEG): container finished" podID="fdfa810e-df55-4e25-8280-b6275c932336" 
containerID="937a66892ca282be79cde33ce6bfc835a53f9fcd82a044866e67c40c8313e482" exitCode=0 Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.445063 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xrf5w" event={"ID":"fdfa810e-df55-4e25-8280-b6275c932336","Type":"ContainerDied","Data":"937a66892ca282be79cde33ce6bfc835a53f9fcd82a044866e67c40c8313e482"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.460992 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-870b-account-create-update-2rg5l" event={"ID":"8d2de408-6b00-4029-a1a9-d0ab19401f96","Type":"ContainerStarted","Data":"0619b103b3467bd1fe70b0902054e340ccfb8ab97a1116e60aa44770c60580dc"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.461397 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-870b-account-create-update-2rg5l" event={"ID":"8d2de408-6b00-4029-a1a9-d0ab19401f96","Type":"ContainerStarted","Data":"4090ecb8922250fe832e8bf11a3ced8a6fc97e19a09a7ea9a93b12b4317b4ca6"} Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.478604 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-p4tmx" podStartSLOduration=3.47857609 podStartE2EDuration="3.47857609s" podCreationTimestamp="2026-01-29 14:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:01.458219241 +0000 UTC m=+1156.152953643" watchObservedRunningTime="2026-01-29 14:22:01.47857609 +0000 UTC m=+1156.173310472" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.500234 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-870b-account-create-update-2rg5l" podStartSLOduration=3.500213155 podStartE2EDuration="3.500213155s" podCreationTimestamp="2026-01-29 14:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:01.494971793 +0000 UTC m=+1156.189706175" watchObservedRunningTime="2026-01-29 14:22:01.500213155 +0000 UTC m=+1156.194947537" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.553998 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.916454 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.920655 4753 scope.go:117] "RemoveContainer" containerID="f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.984948 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.993259 4753 scope.go:117] "RemoveContainer" containerID="372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c" Jan 29 14:22:01 crc kubenswrapper[4753]: E0129 14:22:01.993768 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c\": container with ID starting with 372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c not found: ID does not exist" containerID="372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.993883 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c"} err="failed to get container status \"372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c\": rpc error: code = NotFound desc = could not find container \"372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c\": container with ID starting with 372fc270c65fa295fb7425ca9f140ad0cc9f0cb8a7273f36494df0e979d1fd6c not found: ID does not exist" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.993982 4753 scope.go:117] "RemoveContainer" containerID="f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224" Jan 29 14:22:01 crc kubenswrapper[4753]: E0129 14:22:01.994613 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224\": container with ID starting with f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224 not found: ID does not exist" containerID="f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224" Jan 29 14:22:01 crc kubenswrapper[4753]: I0129 14:22:01.994719 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224"} err="failed to get container status \"f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224\": rpc error: code = NotFound desc = could not find container \"f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224\": container with ID starting with f2cb03fa25a679407436bec7e5e107c5567e907acccfbc128093594cd5c8b224 not found: ID does not exist" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.002463 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.016323 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:22:02 crc kubenswrapper[4753]: E0129 14:22:02.017501 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerName="glance-httpd" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.017543 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerName="glance-httpd" Jan 29 14:22:02 crc kubenswrapper[4753]: E0129 14:22:02.017554 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f31315-8403-4d07-927f-4fe66c4db88b" containerName="glance-httpd" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.017560 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f31315-8403-4d07-927f-4fe66c4db88b" containerName="glance-httpd" Jan 29 14:22:02 crc kubenswrapper[4753]: E0129 14:22:02.017580 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f31315-8403-4d07-927f-4fe66c4db88b" containerName="glance-log" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.017586 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f31315-8403-4d07-927f-4fe66c4db88b" containerName="glance-log" Jan 29 14:22:02 crc kubenswrapper[4753]: E0129 14:22:02.017599 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerName="glance-log" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.017605 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerName="glance-log" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.017861 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerName="glance-log" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.017879 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="107a450b-f03d-4fbb-ad45-e04584ed3972" containerName="glance-httpd" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.017890 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f31315-8403-4d07-927f-4fe66c4db88b" containerName="glance-httpd" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.017905 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f31315-8403-4d07-927f-4fe66c4db88b" containerName="glance-log" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.022704 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-scripts\") pod \"107a450b-f03d-4fbb-ad45-e04584ed3972\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.022754 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-httpd-run\") pod \"107a450b-f03d-4fbb-ad45-e04584ed3972\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.022847 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-internal-tls-certs\") pod \"107a450b-f03d-4fbb-ad45-e04584ed3972\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.022874 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9qr6\" (UniqueName: \"kubernetes.io/projected/107a450b-f03d-4fbb-ad45-e04584ed3972-kube-api-access-h9qr6\") pod \"107a450b-f03d-4fbb-ad45-e04584ed3972\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.022939 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-config-data\") pod \"107a450b-f03d-4fbb-ad45-e04584ed3972\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.023053 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-combined-ca-bundle\") pod \"107a450b-f03d-4fbb-ad45-e04584ed3972\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.023072 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-logs\") pod \"107a450b-f03d-4fbb-ad45-e04584ed3972\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.023095 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"107a450b-f03d-4fbb-ad45-e04584ed3972\" (UID: \"107a450b-f03d-4fbb-ad45-e04584ed3972\") " Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.024916 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "107a450b-f03d-4fbb-ad45-e04584ed3972" (UID: "107a450b-f03d-4fbb-ad45-e04584ed3972"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.029474 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-scripts" (OuterVolumeSpecName: "scripts") pod "107a450b-f03d-4fbb-ad45-e04584ed3972" (UID: "107a450b-f03d-4fbb-ad45-e04584ed3972"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.029952 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-logs" (OuterVolumeSpecName: "logs") pod "107a450b-f03d-4fbb-ad45-e04584ed3972" (UID: "107a450b-f03d-4fbb-ad45-e04584ed3972"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.035894 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.043741 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.048122 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107a450b-f03d-4fbb-ad45-e04584ed3972-kube-api-access-h9qr6" (OuterVolumeSpecName: "kube-api-access-h9qr6") pod "107a450b-f03d-4fbb-ad45-e04584ed3972" (UID: "107a450b-f03d-4fbb-ad45-e04584ed3972"). InnerVolumeSpecName "kube-api-access-h9qr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.048225 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "107a450b-f03d-4fbb-ad45-e04584ed3972" (UID: "107a450b-f03d-4fbb-ad45-e04584ed3972"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.048386 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.048641 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.083771 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "107a450b-f03d-4fbb-ad45-e04584ed3972" (UID: "107a450b-f03d-4fbb-ad45-e04584ed3972"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.103258 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "107a450b-f03d-4fbb-ad45-e04584ed3972" (UID: "107a450b-f03d-4fbb-ad45-e04584ed3972"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.124945 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.125420 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-config-data\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.125462 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-scripts\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.125537 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.125558 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-logs\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.125586 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.125605 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnnqf\" (UniqueName: \"kubernetes.io/projected/3184c640-0157-4211-aa5a-aada8557e9f8-kube-api-access-nnnqf\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.125780 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.126027 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.126042 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.126064 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.126074 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.126081 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/107a450b-f03d-4fbb-ad45-e04584ed3972-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.126092 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.126103 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9qr6\" (UniqueName: \"kubernetes.io/projected/107a450b-f03d-4fbb-ad45-e04584ed3972-kube-api-access-h9qr6\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.146763 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.150293 4753 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-config-data" (OuterVolumeSpecName: "config-data") pod "107a450b-f03d-4fbb-ad45-e04584ed3972" (UID: "107a450b-f03d-4fbb-ad45-e04584ed3972"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.165682 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f31315-8403-4d07-927f-4fe66c4db88b" path="/var/lib/kubelet/pods/48f31315-8403-4d07-927f-4fe66c4db88b/volumes" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.166466 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af576162-4633-43b7-908f-a6dc16560cc7" path="/var/lib/kubelet/pods/af576162-4633-43b7-908f-a6dc16560cc7/volumes" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.227796 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.227852 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-config-data\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.227886 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-scripts\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.227941 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.227961 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-logs\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.227984 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnnqf\" (UniqueName: \"kubernetes.io/projected/3184c640-0157-4211-aa5a-aada8557e9f8-kube-api-access-nnnqf\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.228004 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 
14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.228040 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.228095 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107a450b-f03d-4fbb-ad45-e04584ed3972-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.228106 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.228514 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.228781 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.229064 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-logs\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.234126 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-scripts\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.234141 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.235683 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-config-data\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.237271 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc 
kubenswrapper[4753]: I0129 14:22:02.245140 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnnqf\" (UniqueName: \"kubernetes.io/projected/3184c640-0157-4211-aa5a-aada8557e9f8-kube-api-access-nnnqf\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.262896 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.402024 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.483489 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerStarted","Data":"34a963a7a79a8f1012999d55d7e77a828976deb3d1be05be000c6a37cda111cd"} Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.499555 4753 generic.go:334] "Generic (PLEG): container finished" podID="7df97e34-891e-41f6-bf19-24d14e83c610" containerID="4e6451e61042e484600018939d4da18de458042e275e4e20bffaa3d9d6779d31" exitCode=0 Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.499807 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p4tmx" event={"ID":"7df97e34-891e-41f6-bf19-24d14e83c610","Type":"ContainerDied","Data":"4e6451e61042e484600018939d4da18de458042e275e4e20bffaa3d9d6779d31"} Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.522580 4753 generic.go:334] "Generic (PLEG): container finished" podID="845fdb72-7823-4160-941a-e70288b5f77b" containerID="09fccc35a93bd7153a392185cb41cd50ef322765121cbd848e3a3e60eb0e62c6" exitCode=0 Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.522690 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" event={"ID":"845fdb72-7823-4160-941a-e70288b5f77b","Type":"ContainerDied","Data":"09fccc35a93bd7153a392185cb41cd50ef322765121cbd848e3a3e60eb0e62c6"} Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.544588 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"107a450b-f03d-4fbb-ad45-e04584ed3972","Type":"ContainerDied","Data":"309dc651b59d91cac5a63e3afaae4d369469ea22758191f49aadbe0296ec2404"} Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.544837 4753 scope.go:117] "RemoveContainer" containerID="324879b4bc43390a72b922e6f1989965ca8ade4892900bc8469ddff0dccfb0ef" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.544618 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.547487 4753 generic.go:334] "Generic (PLEG): container finished" podID="8d2de408-6b00-4029-a1a9-d0ab19401f96" containerID="0619b103b3467bd1fe70b0902054e340ccfb8ab97a1116e60aa44770c60580dc" exitCode=0 Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.547556 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-870b-account-create-update-2rg5l" event={"ID":"8d2de408-6b00-4029-a1a9-d0ab19401f96","Type":"ContainerDied","Data":"0619b103b3467bd1fe70b0902054e340ccfb8ab97a1116e60aa44770c60580dc"} Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.552646 4753 generic.go:334] "Generic (PLEG): container finished" podID="9730846e-4e43-45f7-8b5d-38d4753b3b80" containerID="b77d8c3d96dc50269ebb9916110f03d8650bc3d367afa27d97606e1aa2292caa" exitCode=0 Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.552718 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fad8-account-create-update-vlh99" event={"ID":"9730846e-4e43-45f7-8b5d-38d4753b3b80","Type":"ContainerDied","Data":"b77d8c3d96dc50269ebb9916110f03d8650bc3d367afa27d97606e1aa2292caa"} Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.553880 4753 generic.go:334] "Generic (PLEG): container finished" podID="4dade6dd-52d6-419f-8460-438e78771a6d" containerID="90e322952691e5340d7dfdb0eef5d70192f25afb3fa06e2774112e840dc0d238" exitCode=0 Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.553954 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n6b96" event={"ID":"4dade6dd-52d6-419f-8460-438e78771a6d","Type":"ContainerDied","Data":"90e322952691e5340d7dfdb0eef5d70192f25afb3fa06e2774112e840dc0d238"} Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.586407 4753 scope.go:117] "RemoveContainer" containerID="82d06422f106535b2475859bf3a3c11ef7c13e57119445daed3d619f602e5be6" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.645280 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.677300 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.697446 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.699883 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.713711 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.714003 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.742389 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.750273 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.750545 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.750617 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.750676 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-logs\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.750702 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.750778 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qtp8\" (UniqueName: \"kubernetes.io/projected/6810e266-dec6-4731-884b-067f214781c2-kube-api-access-5qtp8\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.750920 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.751030 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.854665 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.854731 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.854756 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.854778 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-logs\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.854795 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.854825 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qtp8\" (UniqueName: \"kubernetes.io/projected/6810e266-dec6-4731-884b-067f214781c2-kube-api-access-5qtp8\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.854872 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.854910 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.860371 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.860660 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.862671 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.862761 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-logs\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.867120 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.871109 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.875282 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.880985 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qtp8\" (UniqueName: \"kubernetes.io/projected/6810e266-dec6-4731-884b-067f214781c2-kube-api-access-5qtp8\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.888006 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " pod="openstack/glance-default-internal-api-0" Jan 29 14:22:02 crc kubenswrapper[4753]: I0129 14:22:02.971398 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.046395 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.058688 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvcnz\" (UniqueName: \"kubernetes.io/projected/fdfa810e-df55-4e25-8280-b6275c932336-kube-api-access-kvcnz\") pod \"fdfa810e-df55-4e25-8280-b6275c932336\" (UID: \"fdfa810e-df55-4e25-8280-b6275c932336\") " Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.059010 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdfa810e-df55-4e25-8280-b6275c932336-operator-scripts\") pod \"fdfa810e-df55-4e25-8280-b6275c932336\" (UID: \"fdfa810e-df55-4e25-8280-b6275c932336\") " Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.059709 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdfa810e-df55-4e25-8280-b6275c932336-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fdfa810e-df55-4e25-8280-b6275c932336" (UID: "fdfa810e-df55-4e25-8280-b6275c932336"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.059927 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdfa810e-df55-4e25-8280-b6275c932336-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.065601 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdfa810e-df55-4e25-8280-b6275c932336-kube-api-access-kvcnz" (OuterVolumeSpecName: "kube-api-access-kvcnz") pod "fdfa810e-df55-4e25-8280-b6275c932336" (UID: "fdfa810e-df55-4e25-8280-b6275c932336"). InnerVolumeSpecName "kube-api-access-kvcnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.166908 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvcnz\" (UniqueName: \"kubernetes.io/projected/fdfa810e-df55-4e25-8280-b6275c932336-kube-api-access-kvcnz\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.229947 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.563259 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3184c640-0157-4211-aa5a-aada8557e9f8","Type":"ContainerStarted","Data":"d6674fe2218f2f77bdb97bc90e9d01638b395fd39f7b8d3fbd7084662bcd6fb9"} Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.569112 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerStarted","Data":"d1875b40d89e8eb999f825dfa6aa985fae4a1de5433cfb8169e1aa2dc96506da"} Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.578686 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xrf5w" event={"ID":"fdfa810e-df55-4e25-8280-b6275c932336","Type":"ContainerDied","Data":"03499132c504230176b7e4e663830c97f5f62fa51de4cf6eab3350c27fa680d8"} Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.578732 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03499132c504230176b7e4e663830c97f5f62fa51de4cf6eab3350c27fa680d8" Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.578733 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xrf5w" Jan 29 14:22:03 crc kubenswrapper[4753]: I0129 14:22:03.647897 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.092428 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p4tmx" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.169468 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107a450b-f03d-4fbb-ad45-e04584ed3972" path="/var/lib/kubelet/pods/107a450b-f03d-4fbb-ad45-e04584ed3972/volumes" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.194696 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4659g\" (UniqueName: \"kubernetes.io/projected/7df97e34-891e-41f6-bf19-24d14e83c610-kube-api-access-4659g\") pod \"7df97e34-891e-41f6-bf19-24d14e83c610\" (UID: \"7df97e34-891e-41f6-bf19-24d14e83c610\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.194926 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df97e34-891e-41f6-bf19-24d14e83c610-operator-scripts\") pod \"7df97e34-891e-41f6-bf19-24d14e83c610\" (UID: \"7df97e34-891e-41f6-bf19-24d14e83c610\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.196009 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df97e34-891e-41f6-bf19-24d14e83c610-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7df97e34-891e-41f6-bf19-24d14e83c610" (UID: "7df97e34-891e-41f6-bf19-24d14e83c610"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.204036 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df97e34-891e-41f6-bf19-24d14e83c610-kube-api-access-4659g" (OuterVolumeSpecName: "kube-api-access-4659g") pod "7df97e34-891e-41f6-bf19-24d14e83c610" (UID: "7df97e34-891e-41f6-bf19-24d14e83c610"). InnerVolumeSpecName "kube-api-access-4659g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.281263 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.297560 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df97e34-891e-41f6-bf19-24d14e83c610-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.297596 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4659g\" (UniqueName: \"kubernetes.io/projected/7df97e34-891e-41f6-bf19-24d14e83c610-kube-api-access-4659g\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.334874 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.398219 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dade6dd-52d6-419f-8460-438e78771a6d-operator-scripts\") pod \"4dade6dd-52d6-419f-8460-438e78771a6d\" (UID: \"4dade6dd-52d6-419f-8460-438e78771a6d\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.398673 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tftg\" (UniqueName: \"kubernetes.io/projected/4dade6dd-52d6-419f-8460-438e78771a6d-kube-api-access-8tftg\") pod \"4dade6dd-52d6-419f-8460-438e78771a6d\" (UID: \"4dade6dd-52d6-419f-8460-438e78771a6d\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.398703 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/845fdb72-7823-4160-941a-e70288b5f77b-operator-scripts\") pod \"845fdb72-7823-4160-941a-e70288b5f77b\" (UID: \"845fdb72-7823-4160-941a-e70288b5f77b\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.398745 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbjx9\" (UniqueName: \"kubernetes.io/projected/845fdb72-7823-4160-941a-e70288b5f77b-kube-api-access-bbjx9\") pod \"845fdb72-7823-4160-941a-e70288b5f77b\" (UID: \"845fdb72-7823-4160-941a-e70288b5f77b\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.399874 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/845fdb72-7823-4160-941a-e70288b5f77b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "845fdb72-7823-4160-941a-e70288b5f77b" (UID: "845fdb72-7823-4160-941a-e70288b5f77b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.400622 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dade6dd-52d6-419f-8460-438e78771a6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4dade6dd-52d6-419f-8460-438e78771a6d" (UID: "4dade6dd-52d6-419f-8460-438e78771a6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.403428 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dade6dd-52d6-419f-8460-438e78771a6d-kube-api-access-8tftg" (OuterVolumeSpecName: "kube-api-access-8tftg") pod "4dade6dd-52d6-419f-8460-438e78771a6d" (UID: "4dade6dd-52d6-419f-8460-438e78771a6d"). InnerVolumeSpecName "kube-api-access-8tftg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.403859 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845fdb72-7823-4160-941a-e70288b5f77b-kube-api-access-bbjx9" (OuterVolumeSpecName: "kube-api-access-bbjx9") pod "845fdb72-7823-4160-941a-e70288b5f77b" (UID: "845fdb72-7823-4160-941a-e70288b5f77b"). InnerVolumeSpecName "kube-api-access-bbjx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.469875 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.501101 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.502489 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9730846e-4e43-45f7-8b5d-38d4753b3b80-operator-scripts\") pod \"9730846e-4e43-45f7-8b5d-38d4753b3b80\" (UID: \"9730846e-4e43-45f7-8b5d-38d4753b3b80\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.502534 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzqtv\" (UniqueName: \"kubernetes.io/projected/9730846e-4e43-45f7-8b5d-38d4753b3b80-kube-api-access-hzqtv\") pod \"9730846e-4e43-45f7-8b5d-38d4753b3b80\" (UID: \"9730846e-4e43-45f7-8b5d-38d4753b3b80\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.503051 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tftg\" (UniqueName: \"kubernetes.io/projected/4dade6dd-52d6-419f-8460-438e78771a6d-kube-api-access-8tftg\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.503064 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/845fdb72-7823-4160-941a-e70288b5f77b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.503082 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbjx9\" (UniqueName: \"kubernetes.io/projected/845fdb72-7823-4160-941a-e70288b5f77b-kube-api-access-bbjx9\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.503092 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/4dade6dd-52d6-419f-8460-438e78771a6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.503354 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9730846e-4e43-45f7-8b5d-38d4753b3b80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9730846e-4e43-45f7-8b5d-38d4753b3b80" (UID: "9730846e-4e43-45f7-8b5d-38d4753b3b80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.532390 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9730846e-4e43-45f7-8b5d-38d4753b3b80-kube-api-access-hzqtv" (OuterVolumeSpecName: "kube-api-access-hzqtv") pod "9730846e-4e43-45f7-8b5d-38d4753b3b80" (UID: "9730846e-4e43-45f7-8b5d-38d4753b3b80"). InnerVolumeSpecName "kube-api-access-hzqtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.594678 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerStarted","Data":"bfa2ab3034086679be4711526e2073cb927a1b72dc3719d518e23f941db74fc3"} Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.602668 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p4tmx" event={"ID":"7df97e34-891e-41f6-bf19-24d14e83c610","Type":"ContainerDied","Data":"d65c5bbdd306cedae8c7d78117160f10700fe7d5433c05ca9b6df3ffe2762b33"} Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.602879 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d65c5bbdd306cedae8c7d78117160f10700fe7d5433c05ca9b6df3ffe2762b33" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.602915 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-p4tmx" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.603916 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d2de408-6b00-4029-a1a9-d0ab19401f96-operator-scripts\") pod \"8d2de408-6b00-4029-a1a9-d0ab19401f96\" (UID: \"8d2de408-6b00-4029-a1a9-d0ab19401f96\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.603964 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j5tb\" (UniqueName: \"kubernetes.io/projected/8d2de408-6b00-4029-a1a9-d0ab19401f96-kube-api-access-6j5tb\") pod \"8d2de408-6b00-4029-a1a9-d0ab19401f96\" (UID: \"8d2de408-6b00-4029-a1a9-d0ab19401f96\") " Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.604484 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9730846e-4e43-45f7-8b5d-38d4753b3b80-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.604503 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzqtv\" (UniqueName: \"kubernetes.io/projected/9730846e-4e43-45f7-8b5d-38d4753b3b80-kube-api-access-hzqtv\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.604698 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d2de408-6b00-4029-a1a9-d0ab19401f96-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d2de408-6b00-4029-a1a9-d0ab19401f96" (UID: "8d2de408-6b00-4029-a1a9-d0ab19401f96"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.608344 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d2de408-6b00-4029-a1a9-d0ab19401f96-kube-api-access-6j5tb" (OuterVolumeSpecName: "kube-api-access-6j5tb") pod "8d2de408-6b00-4029-a1a9-d0ab19401f96" (UID: "8d2de408-6b00-4029-a1a9-d0ab19401f96"). InnerVolumeSpecName "kube-api-access-6j5tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.609984 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" event={"ID":"845fdb72-7823-4160-941a-e70288b5f77b","Type":"ContainerDied","Data":"00fa8dce14261e9438d8300c6739511e46d28271bad9287d35787a1999da21fb"} Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.610006 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00fa8dce14261e9438d8300c6739511e46d28271bad9287d35787a1999da21fb" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.610024 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ced3-account-create-update-h5z4v" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.612960 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6810e266-dec6-4731-884b-067f214781c2","Type":"ContainerStarted","Data":"9a868df539376371754bcb8cc9c201732d3c2ee4f5c9581079b86ae7fa2e189c"} Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.618842 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3184c640-0157-4211-aa5a-aada8557e9f8","Type":"ContainerStarted","Data":"f5895e5563577e5f99ca58e92f19470e5b0e974e28396e83be746eec355480e3"} Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.627500 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-870b-account-create-update-2rg5l" event={"ID":"8d2de408-6b00-4029-a1a9-d0ab19401f96","Type":"ContainerDied","Data":"4090ecb8922250fe832e8bf11a3ced8a6fc97e19a09a7ea9a93b12b4317b4ca6"} Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.627546 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4090ecb8922250fe832e8bf11a3ced8a6fc97e19a09a7ea9a93b12b4317b4ca6" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.627617 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-870b-account-create-update-2rg5l" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.630783 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fad8-account-create-update-vlh99" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.630796 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fad8-account-create-update-vlh99" event={"ID":"9730846e-4e43-45f7-8b5d-38d4753b3b80","Type":"ContainerDied","Data":"948ec19997a6f1a99a01526047cdebe2561fa4aec5639358ce1f049a7c326cba"} Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.630968 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="948ec19997a6f1a99a01526047cdebe2561fa4aec5639358ce1f049a7c326cba" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.633040 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n6b96" event={"ID":"4dade6dd-52d6-419f-8460-438e78771a6d","Type":"ContainerDied","Data":"da52614ae156b30c0346a064624ada510bcd609181c39312fd431ce5f8acf385"} Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.633072 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da52614ae156b30c0346a064624ada510bcd609181c39312fd431ce5f8acf385" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.633074 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-n6b96" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.705720 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d2de408-6b00-4029-a1a9-d0ab19401f96-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:04 crc kubenswrapper[4753]: I0129 14:22:04.705747 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j5tb\" (UniqueName: \"kubernetes.io/projected/8d2de408-6b00-4029-a1a9-d0ab19401f96-kube-api-access-6j5tb\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:05 crc kubenswrapper[4753]: I0129 14:22:05.652806 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6810e266-dec6-4731-884b-067f214781c2","Type":"ContainerStarted","Data":"e2d1fb3d7fa36ee6d4949b785910ce4cfa547832c2acade41dcbf22dc90c2c6e"} Jan 29 14:22:05 crc kubenswrapper[4753]: I0129 14:22:05.653384 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6810e266-dec6-4731-884b-067f214781c2","Type":"ContainerStarted","Data":"b0fc984341d3bf9cf81937c474fb2a82c3b897efaf7a2c1e16681e411cbe9085"} Jan 29 14:22:05 crc kubenswrapper[4753]: I0129 14:22:05.657640 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3184c640-0157-4211-aa5a-aada8557e9f8","Type":"ContainerStarted","Data":"bf98498966b9708676b313afca0a0b4bb674752fe39d67e44f9f70b35df870b7"} Jan 29 14:22:05 crc kubenswrapper[4753]: I0129 14:22:05.668338 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerStarted","Data":"d933784f62ffaede68f9ab917303b9b4e747d760578c19ca42ae6f17de192258"} Jan 29 14:22:05 crc kubenswrapper[4753]: I0129 14:22:05.687365 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.687343951 podStartE2EDuration="3.687343951s" podCreationTimestamp="2026-01-29 14:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:05.684068553 +0000 UTC m=+1160.378802935" watchObservedRunningTime="2026-01-29 14:22:05.687343951 +0000 UTC m=+1160.382078333" Jan 29 14:22:05 crc kubenswrapper[4753]: I0129 14:22:05.725204 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.725182962 podStartE2EDuration="4.725182962s" podCreationTimestamp="2026-01-29 14:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:05.721956715 +0000 UTC m=+1160.416691097" watchObservedRunningTime="2026-01-29 14:22:05.725182962 +0000 UTC m=+1160.419917344" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.701542 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerStarted","Data":"d17c4e102795abfc687f012cc98f8bb04d74a573db8de3ae7fda99c02075ee6b"} Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.702107 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.728580 4753 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8c5ld"] Jan 29 14:22:08 crc kubenswrapper[4753]: E0129 14:22:08.729007 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dade6dd-52d6-419f-8460-438e78771a6d" containerName="mariadb-database-create" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729033 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dade6dd-52d6-419f-8460-438e78771a6d" containerName="mariadb-database-create" Jan 29 14:22:08 crc kubenswrapper[4753]: E0129 14:22:08.729075 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845fdb72-7823-4160-941a-e70288b5f77b" containerName="mariadb-account-create-update" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729086 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="845fdb72-7823-4160-941a-e70288b5f77b" containerName="mariadb-account-create-update" Jan 29 14:22:08 crc kubenswrapper[4753]: E0129 14:22:08.729099 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfa810e-df55-4e25-8280-b6275c932336" containerName="mariadb-database-create" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729112 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfa810e-df55-4e25-8280-b6275c932336" containerName="mariadb-database-create" Jan 29 14:22:08 crc kubenswrapper[4753]: E0129 14:22:08.729127 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9730846e-4e43-45f7-8b5d-38d4753b3b80" containerName="mariadb-account-create-update" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729135 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9730846e-4e43-45f7-8b5d-38d4753b3b80" containerName="mariadb-account-create-update" Jan 29 14:22:08 crc kubenswrapper[4753]: E0129 14:22:08.729182 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d2de408-6b00-4029-a1a9-d0ab19401f96" containerName="mariadb-account-create-update" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729191 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d2de408-6b00-4029-a1a9-d0ab19401f96" containerName="mariadb-account-create-update" Jan 29 14:22:08 crc kubenswrapper[4753]: E0129 14:22:08.729209 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df97e34-891e-41f6-bf19-24d14e83c610" containerName="mariadb-database-create" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729217 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df97e34-891e-41f6-bf19-24d14e83c610" containerName="mariadb-database-create" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729451 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="845fdb72-7823-4160-941a-e70288b5f77b" containerName="mariadb-account-create-update" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729477 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df97e34-891e-41f6-bf19-24d14e83c610" containerName="mariadb-database-create" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729497 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dade6dd-52d6-419f-8460-438e78771a6d" containerName="mariadb-database-create" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729514 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d2de408-6b00-4029-a1a9-d0ab19401f96" containerName="mariadb-account-create-update" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729530 4753 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfa810e-df55-4e25-8280-b6275c932336" containerName="mariadb-database-create" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.729544 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9730846e-4e43-45f7-8b5d-38d4753b3b80" containerName="mariadb-account-create-update" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.730300 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.732787 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zm7mz" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.733077 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.733205 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.750815 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8c5ld"] Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.766380 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.485590461 podStartE2EDuration="8.766350428s" podCreationTimestamp="2026-01-29 14:22:00 +0000 UTC" firstStartedPulling="2026-01-29 14:22:01.604713325 +0000 UTC m=+1156.299447707" lastFinishedPulling="2026-01-29 14:22:07.885473292 +0000 UTC m=+1162.580207674" observedRunningTime="2026-01-29 14:22:08.744186139 +0000 UTC m=+1163.438920531" watchObservedRunningTime="2026-01-29 14:22:08.766350428 +0000 UTC m=+1163.461084810" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.882105 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9shz\" (UniqueName: \"kubernetes.io/projected/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-kube-api-access-c9shz\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.882555 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-scripts\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.882653 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-config-data\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.882901 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc 
kubenswrapper[4753]: I0129 14:22:08.985213 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.985378 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9shz\" (UniqueName: \"kubernetes.io/projected/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-kube-api-access-c9shz\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.985533 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-scripts\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.985797 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-config-data\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.997044 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-scripts\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.997364 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:08 crc kubenswrapper[4753]: I0129 14:22:08.997500 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-config-data\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:09 crc kubenswrapper[4753]: I0129 14:22:09.002640 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9shz\" (UniqueName: \"kubernetes.io/projected/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-kube-api-access-c9shz\") pod \"nova-cell0-conductor-db-sync-8c5ld\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") " pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:09 crc kubenswrapper[4753]: I0129 14:22:09.046271 4753 util.go:30] "No sandbox for pod can be found. 
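
The pod_startup_latency_tracker entries report two durations: podStartE2EDuration (pod creation to first observed running) and podStartSLOduration, which excludes the image-pull window (lastFinishedPulling minus firstStartedPulling; pods that pulled nothing show the zero timestamp 0001-01-01 and identical values, as with the two glance pods above). For ceilometer-0 the numbers are consistent: 8.766350428s minus the 6.280759967s pull window gives exactly 2.485590461s. A checking sketch in Go; the timestamp layout is an assumption, and the trailing "m=+..." monotonic-clock suffix is stripped before parsing:

    package main

    import (
        "fmt"
        "strings"
        "time"
    )

    // time.Parse accepts fractional seconds even when the layout omits them.
    const layout = "2006-01-02 15:04:05 -0700 MST"

    func parse(ts string) time.Time {
        ts, _, _ = strings.Cut(ts, " m=") // drop Go's monotonic reading if present
        t, err := time.Parse(layout, ts)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        e2e, _ := time.ParseDuration("8.766350428s")                       // podStartE2EDuration, ceilometer-0
        first := parse("2026-01-29 14:22:01.604713325 +0000 UTC")          // firstStartedPulling
        last := parse("2026-01-29 14:22:07.885473292 +0000 UTC")           // lastFinishedPulling
        fmt.Println("podStartSLOduration =", e2e-last.Sub(first))          // 2.485590461s, matching the log
    }
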
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8c5ld" Jan 29 14:22:09 crc kubenswrapper[4753]: I0129 14:22:09.436469 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:09 crc kubenswrapper[4753]: W0129 14:22:09.655486 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6ffcfa6_8ed5_4d92_9e63_87756c03f8e8.slice/crio-ec68f3fa2ff9e373511d3aeef8c0230f731fe74461953d0815f8bec1825f4ac9 WatchSource:0}: Error finding container ec68f3fa2ff9e373511d3aeef8c0230f731fe74461953d0815f8bec1825f4ac9: Status 404 returned error can't find the container with id ec68f3fa2ff9e373511d3aeef8c0230f731fe74461953d0815f8bec1825f4ac9 Jan 29 14:22:09 crc kubenswrapper[4753]: I0129 14:22:09.663288 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8c5ld"] Jan 29 14:22:09 crc kubenswrapper[4753]: I0129 14:22:09.713553 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8c5ld" event={"ID":"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8","Type":"ContainerStarted","Data":"ec68f3fa2ff9e373511d3aeef8c0230f731fe74461953d0815f8bec1825f4ac9"} Jan 29 14:22:10 crc kubenswrapper[4753]: I0129 14:22:10.727525 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="ceilometer-central-agent" containerID="cri-o://d1875b40d89e8eb999f825dfa6aa985fae4a1de5433cfb8169e1aa2dc96506da" gracePeriod=30 Jan 29 14:22:10 crc kubenswrapper[4753]: I0129 14:22:10.727599 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="proxy-httpd" containerID="cri-o://d17c4e102795abfc687f012cc98f8bb04d74a573db8de3ae7fda99c02075ee6b" gracePeriod=30 Jan 29 14:22:10 crc kubenswrapper[4753]: I0129 14:22:10.727610 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="sg-core" containerID="cri-o://d933784f62ffaede68f9ab917303b9b4e747d760578c19ca42ae6f17de192258" gracePeriod=30 Jan 29 14:22:10 crc kubenswrapper[4753]: I0129 14:22:10.727660 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="ceilometer-notification-agent" containerID="cri-o://bfa2ab3034086679be4711526e2073cb927a1b72dc3719d518e23f941db74fc3" gracePeriod=30 Jan 29 14:22:11 crc kubenswrapper[4753]: I0129 14:22:11.739973 4753 generic.go:334] "Generic (PLEG): container finished" podID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerID="d17c4e102795abfc687f012cc98f8bb04d74a573db8de3ae7fda99c02075ee6b" exitCode=0 Jan 29 14:22:11 crc kubenswrapper[4753]: I0129 14:22:11.740332 4753 generic.go:334] "Generic (PLEG): container finished" podID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerID="d933784f62ffaede68f9ab917303b9b4e747d760578c19ca42ae6f17de192258" exitCode=2 Jan 29 14:22:11 crc kubenswrapper[4753]: I0129 14:22:11.740344 4753 generic.go:334] "Generic (PLEG): container finished" podID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerID="bfa2ab3034086679be4711526e2073cb927a1b72dc3719d518e23f941db74fc3" exitCode=0 Jan 29 14:22:11 crc kubenswrapper[4753]: I0129 14:22:11.740090 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerDied","Data":"d17c4e102795abfc687f012cc98f8bb04d74a573db8de3ae7fda99c02075ee6b"} Jan 29 14:22:11 crc kubenswrapper[4753]: I0129 14:22:11.740384 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerDied","Data":"d933784f62ffaede68f9ab917303b9b4e747d760578c19ca42ae6f17de192258"} Jan 29 14:22:11 crc kubenswrapper[4753]: I0129 14:22:11.740403 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerDied","Data":"bfa2ab3034086679be4711526e2073cb927a1b72dc3719d518e23f941db74fc3"} Jan 29 14:22:12 crc kubenswrapper[4753]: I0129 14:22:12.402188 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 14:22:12 crc kubenswrapper[4753]: I0129 14:22:12.402498 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 14:22:12 crc kubenswrapper[4753]: I0129 14:22:12.457227 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 14:22:12 crc kubenswrapper[4753]: I0129 14:22:12.459158 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 14:22:12 crc kubenswrapper[4753]: I0129 14:22:12.749842 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 14:22:12 crc kubenswrapper[4753]: I0129 14:22:12.751512 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 14:22:13 crc kubenswrapper[4753]: I0129 14:22:13.046589 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:13 crc kubenswrapper[4753]: I0129 14:22:13.046634 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:13 crc kubenswrapper[4753]: I0129 14:22:13.089444 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:13 crc kubenswrapper[4753]: I0129 14:22:13.107677 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:13 crc kubenswrapper[4753]: I0129 14:22:13.765016 4753 generic.go:334] "Generic (PLEG): container finished" podID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerID="d1875b40d89e8eb999f825dfa6aa985fae4a1de5433cfb8169e1aa2dc96506da" exitCode=0 Jan 29 14:22:13 crc kubenswrapper[4753]: I0129 14:22:13.765225 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerDied","Data":"d1875b40d89e8eb999f825dfa6aa985fae4a1de5433cfb8169e1aa2dc96506da"} Jan 29 14:22:13 crc kubenswrapper[4753]: I0129 14:22:13.766248 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:13 crc kubenswrapper[4753]: I0129 14:22:13.766401 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:14 crc kubenswrapper[4753]: I0129 
14:22:14.640278 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 14:22:14 crc kubenswrapper[4753]: I0129 14:22:14.668521 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 14:22:15 crc kubenswrapper[4753]: I0129 14:22:15.614293 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:15 crc kubenswrapper[4753]: I0129 14:22:15.618302 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.449726 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.605700 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-config-data\") pod \"b6ecb4be-e2ff-4138-9399-723f1b586a71\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.605779 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-run-httpd\") pod \"b6ecb4be-e2ff-4138-9399-723f1b586a71\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.605835 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-log-httpd\") pod \"b6ecb4be-e2ff-4138-9399-723f1b586a71\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.605904 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-sg-core-conf-yaml\") pod \"b6ecb4be-e2ff-4138-9399-723f1b586a71\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.605932 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-combined-ca-bundle\") pod \"b6ecb4be-e2ff-4138-9399-723f1b586a71\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.606022 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvtbg\" (UniqueName: \"kubernetes.io/projected/b6ecb4be-e2ff-4138-9399-723f1b586a71-kube-api-access-lvtbg\") pod \"b6ecb4be-e2ff-4138-9399-723f1b586a71\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.606044 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-scripts\") pod \"b6ecb4be-e2ff-4138-9399-723f1b586a71\" (UID: \"b6ecb4be-e2ff-4138-9399-723f1b586a71\") " Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.606300 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-run-httpd" (OuterVolumeSpecName: 
"run-httpd") pod "b6ecb4be-e2ff-4138-9399-723f1b586a71" (UID: "b6ecb4be-e2ff-4138-9399-723f1b586a71"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.606444 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b6ecb4be-e2ff-4138-9399-723f1b586a71" (UID: "b6ecb4be-e2ff-4138-9399-723f1b586a71"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.606777 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.606797 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6ecb4be-e2ff-4138-9399-723f1b586a71-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.611399 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-scripts" (OuterVolumeSpecName: "scripts") pod "b6ecb4be-e2ff-4138-9399-723f1b586a71" (UID: "b6ecb4be-e2ff-4138-9399-723f1b586a71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.612080 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ecb4be-e2ff-4138-9399-723f1b586a71-kube-api-access-lvtbg" (OuterVolumeSpecName: "kube-api-access-lvtbg") pod "b6ecb4be-e2ff-4138-9399-723f1b586a71" (UID: "b6ecb4be-e2ff-4138-9399-723f1b586a71"). InnerVolumeSpecName "kube-api-access-lvtbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.647898 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b6ecb4be-e2ff-4138-9399-723f1b586a71" (UID: "b6ecb4be-e2ff-4138-9399-723f1b586a71"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.683680 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6ecb4be-e2ff-4138-9399-723f1b586a71" (UID: "b6ecb4be-e2ff-4138-9399-723f1b586a71"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.708962 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.708997 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.709012 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvtbg\" (UniqueName: \"kubernetes.io/projected/b6ecb4be-e2ff-4138-9399-723f1b586a71-kube-api-access-lvtbg\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.709025 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.729955 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-config-data" (OuterVolumeSpecName: "config-data") pod "b6ecb4be-e2ff-4138-9399-723f1b586a71" (UID: "b6ecb4be-e2ff-4138-9399-723f1b586a71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.810722 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ecb4be-e2ff-4138-9399-723f1b586a71-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.834464 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8c5ld" event={"ID":"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8","Type":"ContainerStarted","Data":"5ebc3f1b2205648bb7601685da242e41c1b97a5b780c0a59881e50d9c9fc0cc9"} Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.837131 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6ecb4be-e2ff-4138-9399-723f1b586a71","Type":"ContainerDied","Data":"34a963a7a79a8f1012999d55d7e77a828976deb3d1be05be000c6a37cda111cd"} Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.837192 4753 scope.go:117] "RemoveContainer" containerID="d17c4e102795abfc687f012cc98f8bb04d74a573db8de3ae7fda99c02075ee6b" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.837329 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.858906 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-8c5ld" podStartSLOduration=2.322327971 podStartE2EDuration="12.858885845s" podCreationTimestamp="2026-01-29 14:22:08 +0000 UTC" firstStartedPulling="2026-01-29 14:22:09.65784865 +0000 UTC m=+1164.352583032" lastFinishedPulling="2026-01-29 14:22:20.194406494 +0000 UTC m=+1174.889140906" observedRunningTime="2026-01-29 14:22:20.851912617 +0000 UTC m=+1175.546647009" watchObservedRunningTime="2026-01-29 14:22:20.858885845 +0000 UTC m=+1175.553620237" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.868453 4753 scope.go:117] "RemoveContainer" containerID="d933784f62ffaede68f9ab917303b9b4e747d760578c19ca42ae6f17de192258" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.893076 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.894524 4753 scope.go:117] "RemoveContainer" containerID="bfa2ab3034086679be4711526e2073cb927a1b72dc3719d518e23f941db74fc3" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.910289 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.917803 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:20 crc kubenswrapper[4753]: E0129 14:22:20.918174 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="ceilometer-central-agent" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.918189 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="ceilometer-central-agent" Jan 29 14:22:20 crc kubenswrapper[4753]: E0129 14:22:20.918207 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="sg-core" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.918212 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="sg-core" Jan 29 14:22:20 crc kubenswrapper[4753]: E0129 14:22:20.918221 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="proxy-httpd" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.918228 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="proxy-httpd" Jan 29 14:22:20 crc kubenswrapper[4753]: E0129 14:22:20.918239 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="ceilometer-notification-agent" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.918244 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="ceilometer-notification-agent" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.918403 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="ceilometer-notification-agent" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.918419 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="sg-core" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 
14:22:20.918430 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="ceilometer-central-agent" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.918443 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" containerName="proxy-httpd" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.919988 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.922697 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.923876 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.926022 4753 scope.go:117] "RemoveContainer" containerID="d1875b40d89e8eb999f825dfa6aa985fae4a1de5433cfb8169e1aa2dc96506da" Jan 29 14:22:20 crc kubenswrapper[4753]: I0129 14:22:20.954914 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.015697 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.015747 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-run-httpd\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.015778 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-log-httpd\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.015827 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-scripts\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.015903 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjmwl\" (UniqueName: \"kubernetes.io/projected/19ab7c78-359d-4d8c-9b79-50f20d6d5377-kube-api-access-pjmwl\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.016818 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-config-data\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.016869 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.118383 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.118431 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-run-httpd\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.118465 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-log-httpd\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.118502 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-scripts\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.118575 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjmwl\" (UniqueName: \"kubernetes.io/projected/19ab7c78-359d-4d8c-9b79-50f20d6d5377-kube-api-access-pjmwl\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.118627 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-config-data\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.118644 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.119598 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-log-httpd\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.120074 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-run-httpd\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0" Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.121964 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0"
Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.123766 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-scripts\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0"
Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.123945 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-config-data\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0"
Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.124270 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0"
Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.142358 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjmwl\" (UniqueName: \"kubernetes.io/projected/19ab7c78-359d-4d8c-9b79-50f20d6d5377-kube-api-access-pjmwl\") pod \"ceilometer-0\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") " pod="openstack/ceilometer-0"
Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.236510 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.803655 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:22:21 crc kubenswrapper[4753]: I0129 14:22:21.854864 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerStarted","Data":"f6d6b16f1bb5514b73c58379f3c2875a390766075a0c5955fe5665a909bc2346"}
Jan 29 14:22:22 crc kubenswrapper[4753]: I0129 14:22:22.165736 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6ecb4be-e2ff-4138-9399-723f1b586a71" path="/var/lib/kubelet/pods/b6ecb4be-e2ff-4138-9399-723f1b586a71/volumes"
Jan 29 14:22:22 crc kubenswrapper[4753]: I0129 14:22:22.870436 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerStarted","Data":"a16af39cb5363457dc4c9c84bf72c40383174786fb254630cb536295a322a2ad"}
Jan 29 14:22:23 crc kubenswrapper[4753]: I0129 14:22:23.882647 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerStarted","Data":"3c9c98dd39e9d1afc45f5142211e557ff60dc8d5a76e4945d79d6ec1980b6877"}
Jan 29 14:22:24 crc kubenswrapper[4753]: I0129 14:22:24.895484 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerStarted","Data":"c61aa197a906c41adba84f2dbf0e7c04cc53c9fcb21a5dd6c59f004f5003ed96"}
Jan 29 14:22:26 crc kubenswrapper[4753]: I0129 14:22:26.914642 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerStarted","Data":"5db8b2bb0f1853ca77053736156202f831fb81b3c571c0820299883e021e200b"}
Jan 29 14:22:26 crc kubenswrapper[4753]: I0129 14:22:26.915687 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 29 14:22:26 crc kubenswrapper[4753]: I0129 14:22:26.935750 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.81603624 podStartE2EDuration="6.935725713s" podCreationTimestamp="2026-01-29 14:22:20 +0000 UTC" firstStartedPulling="2026-01-29 14:22:21.788508046 +0000 UTC m=+1176.483242438" lastFinishedPulling="2026-01-29 14:22:25.908197529 +0000 UTC m=+1180.602931911" observedRunningTime="2026-01-29 14:22:26.934411138 +0000 UTC m=+1181.629145520" watchObservedRunningTime="2026-01-29 14:22:26.935725713 +0000 UTC m=+1181.630460105"
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.326786 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.327646 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="ceilometer-central-agent" containerID="cri-o://a16af39cb5363457dc4c9c84bf72c40383174786fb254630cb536295a322a2ad" gracePeriod=30
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.327746 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="sg-core" containerID="cri-o://c61aa197a906c41adba84f2dbf0e7c04cc53c9fcb21a5dd6c59f004f5003ed96" gracePeriod=30
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.327746 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="proxy-httpd" containerID="cri-o://5db8b2bb0f1853ca77053736156202f831fb81b3c571c0820299883e021e200b" gracePeriod=30
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.327747 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="ceilometer-notification-agent" containerID="cri-o://3c9c98dd39e9d1afc45f5142211e557ff60dc8d5a76e4945d79d6ec1980b6877" gracePeriod=30
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.982403 4753 generic.go:334] "Generic (PLEG): container finished" podID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerID="5db8b2bb0f1853ca77053736156202f831fb81b3c571c0820299883e021e200b" exitCode=0
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.982691 4753 generic.go:334] "Generic (PLEG): container finished" podID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerID="c61aa197a906c41adba84f2dbf0e7c04cc53c9fcb21a5dd6c59f004f5003ed96" exitCode=2
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.982700 4753 generic.go:334] "Generic (PLEG): container finished" podID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerID="3c9c98dd39e9d1afc45f5142211e557ff60dc8d5a76e4945d79d6ec1980b6877" exitCode=0
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.982707 4753 generic.go:334] "Generic (PLEG): container finished" podID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerID="a16af39cb5363457dc4c9c84bf72c40383174786fb254630cb536295a322a2ad" exitCode=0
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.982554 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerDied","Data":"5db8b2bb0f1853ca77053736156202f831fb81b3c571c0820299883e021e200b"}
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.982787 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerDied","Data":"c61aa197a906c41adba84f2dbf0e7c04cc53c9fcb21a5dd6c59f004f5003ed96"}
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.982801 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerDied","Data":"3c9c98dd39e9d1afc45f5142211e557ff60dc8d5a76e4945d79d6ec1980b6877"}
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.982811 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerDied","Data":"a16af39cb5363457dc4c9c84bf72c40383174786fb254630cb536295a322a2ad"}
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.984873 4753 generic.go:334] "Generic (PLEG): container finished" podID="d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8" containerID="5ebc3f1b2205648bb7601685da242e41c1b97a5b780c0a59881e50d9c9fc0cc9" exitCode=0
Jan 29 14:22:31 crc kubenswrapper[4753]: I0129 14:22:31.984919 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8c5ld" event={"ID":"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8","Type":"ContainerDied","Data":"5ebc3f1b2205648bb7601685da242e41c1b97a5b780c0a59881e50d9c9fc0cc9"}
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.224540 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.288647 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-log-httpd\") pod \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") "
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.288684 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-sg-core-conf-yaml\") pod \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") "
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.288717 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-run-httpd\") pod \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") "
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.288737 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-config-data\") pod \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") "
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.288771 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-combined-ca-bundle\") pod \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") "
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.288789 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-scripts\") pod \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") "
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.288848 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjmwl\" (UniqueName: \"kubernetes.io/projected/19ab7c78-359d-4d8c-9b79-50f20d6d5377-kube-api-access-pjmwl\") pod \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\" (UID: \"19ab7c78-359d-4d8c-9b79-50f20d6d5377\") "
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.289655 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "19ab7c78-359d-4d8c-9b79-50f20d6d5377" (UID: "19ab7c78-359d-4d8c-9b79-50f20d6d5377"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.289780 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "19ab7c78-359d-4d8c-9b79-50f20d6d5377" (UID: "19ab7c78-359d-4d8c-9b79-50f20d6d5377"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.293935 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-scripts" (OuterVolumeSpecName: "scripts") pod "19ab7c78-359d-4d8c-9b79-50f20d6d5377" (UID: "19ab7c78-359d-4d8c-9b79-50f20d6d5377"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.294169 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ab7c78-359d-4d8c-9b79-50f20d6d5377-kube-api-access-pjmwl" (OuterVolumeSpecName: "kube-api-access-pjmwl") pod "19ab7c78-359d-4d8c-9b79-50f20d6d5377" (UID: "19ab7c78-359d-4d8c-9b79-50f20d6d5377"). InnerVolumeSpecName "kube-api-access-pjmwl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.316345 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "19ab7c78-359d-4d8c-9b79-50f20d6d5377" (UID: "19ab7c78-359d-4d8c-9b79-50f20d6d5377"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.358727 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19ab7c78-359d-4d8c-9b79-50f20d6d5377" (UID: "19ab7c78-359d-4d8c-9b79-50f20d6d5377"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.373857 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-config-data" (OuterVolumeSpecName: "config-data") pod "19ab7c78-359d-4d8c-9b79-50f20d6d5377" (UID: "19ab7c78-359d-4d8c-9b79-50f20d6d5377"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.391383 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.391415 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjmwl\" (UniqueName: \"kubernetes.io/projected/19ab7c78-359d-4d8c-9b79-50f20d6d5377-kube-api-access-pjmwl\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.391426 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.391436 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.391445 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19ab7c78-359d-4d8c-9b79-50f20d6d5377-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.391453 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.391462 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ab7c78-359d-4d8c-9b79-50f20d6d5377-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.997678 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19ab7c78-359d-4d8c-9b79-50f20d6d5377","Type":"ContainerDied","Data":"f6d6b16f1bb5514b73c58379f3c2875a390766075a0c5955fe5665a909bc2346"}
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.997734 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 14:22:32 crc kubenswrapper[4753]: I0129 14:22:32.997764 4753 scope.go:117] "RemoveContainer" containerID="5db8b2bb0f1853ca77053736156202f831fb81b3c571c0820299883e021e200b"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.055880 4753 scope.go:117] "RemoveContainer" containerID="c61aa197a906c41adba84f2dbf0e7c04cc53c9fcb21a5dd6c59f004f5003ed96"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.085772 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.095716 4753 scope.go:117] "RemoveContainer" containerID="3c9c98dd39e9d1afc45f5142211e557ff60dc8d5a76e4945d79d6ec1980b6877"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.102718 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.115255 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:22:33 crc kubenswrapper[4753]: E0129 14:22:33.115809 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="sg-core"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.115886 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="sg-core"
Jan 29 14:22:33 crc kubenswrapper[4753]: E0129 14:22:33.115957 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="ceilometer-notification-agent"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.116006 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="ceilometer-notification-agent"
Jan 29 14:22:33 crc kubenswrapper[4753]: E0129 14:22:33.116066 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="proxy-httpd"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.116119 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="proxy-httpd"
Jan 29 14:22:33 crc kubenswrapper[4753]: E0129 14:22:33.116202 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="ceilometer-central-agent"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.116261 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="ceilometer-central-agent"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.116470 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="ceilometer-central-agent"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.116532 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="ceilometer-notification-agent"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.116589 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="proxy-httpd"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.116646 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" containerName="sg-core"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.118226 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.120849 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.121186 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.186985 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.201548 4753 scope.go:117] "RemoveContainer" containerID="a16af39cb5363457dc4c9c84bf72c40383174786fb254630cb536295a322a2ad"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.208838 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-log-httpd\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.208875 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-scripts\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.208941 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.209062 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-config-data\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.209100 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.209131 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjt7q\" (UniqueName: \"kubernetes.io/projected/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-kube-api-access-hjt7q\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.209381 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-run-httpd\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.311500 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-config-data\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.311565 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.311607 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjt7q\" (UniqueName: \"kubernetes.io/projected/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-kube-api-access-hjt7q\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.311668 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-run-httpd\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.311719 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-log-httpd\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.311743 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-scripts\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.311765 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.320968 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-log-httpd\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.321529 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-config-data\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.321812 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-run-httpd\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.321955 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.322913 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-scripts\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.323492 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.335823 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjt7q\" (UniqueName: \"kubernetes.io/projected/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-kube-api-access-hjt7q\") pod \"ceilometer-0\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.451969 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8c5ld"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.492387 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.514862 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-combined-ca-bundle\") pod \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") "
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.514932 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9shz\" (UniqueName: \"kubernetes.io/projected/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-kube-api-access-c9shz\") pod \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") "
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.515144 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-config-data\") pod \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") "
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.515215 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-scripts\") pod \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\" (UID: \"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8\") "
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.520022 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-kube-api-access-c9shz" (OuterVolumeSpecName: "kube-api-access-c9shz") pod "d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8" (UID: "d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8"). InnerVolumeSpecName "kube-api-access-c9shz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.522131 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-scripts" (OuterVolumeSpecName: "scripts") pod "d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8" (UID: "d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.546090 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-config-data" (OuterVolumeSpecName: "config-data") pod "d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8" (UID: "d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.547281 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8" (UID: "d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.618244 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.618282 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.618295 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.618309 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9shz\" (UniqueName: \"kubernetes.io/projected/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8-kube-api-access-c9shz\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:33 crc kubenswrapper[4753]: I0129 14:22:33.952603 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:22:33 crc kubenswrapper[4753]: W0129 14:22:33.962539 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1d44c22_d153_48fd_ac3a_cd43ab7b5339.slice/crio-7498d543e284a6b1f4a47109845cfde5024ef772c95e0dde9afae1a5b55420d2 WatchSource:0}: Error finding container 7498d543e284a6b1f4a47109845cfde5024ef772c95e0dde9afae1a5b55420d2: Status 404 returned error can't find the container with id 7498d543e284a6b1f4a47109845cfde5024ef772c95e0dde9afae1a5b55420d2
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.029359 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8c5ld" event={"ID":"d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8","Type":"ContainerDied","Data":"ec68f3fa2ff9e373511d3aeef8c0230f731fe74461953d0815f8bec1825f4ac9"}
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.029812 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec68f3fa2ff9e373511d3aeef8c0230f731fe74461953d0815f8bec1825f4ac9"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.029581 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8c5ld"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.030944 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerStarted","Data":"7498d543e284a6b1f4a47109845cfde5024ef772c95e0dde9afae1a5b55420d2"}
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.129535 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 14:22:34 crc kubenswrapper[4753]: E0129 14:22:34.130033 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8" containerName="nova-cell0-conductor-db-sync"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.130072 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8" containerName="nova-cell0-conductor-db-sync"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.130288 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8" containerName="nova-cell0-conductor-db-sync"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.130920 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.133270 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zm7mz"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.133578 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.140633 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.172968 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ab7c78-359d-4d8c-9b79-50f20d6d5377" path="/var/lib/kubelet/pods/19ab7c78-359d-4d8c-9b79-50f20d6d5377/volumes"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.228029 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.228168 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.228231 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8vx6\" (UniqueName: \"kubernetes.io/projected/fb7a325b-7833-484f-8bba-7dc85ebf57cd-kube-api-access-b8vx6\") pod \"nova-cell0-conductor-0\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.331140 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8vx6\" (UniqueName: \"kubernetes.io/projected/fb7a325b-7833-484f-8bba-7dc85ebf57cd-kube-api-access-b8vx6\") pod \"nova-cell0-conductor-0\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.331551 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.331735 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.337564 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.343947 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.349673 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8vx6\" (UniqueName: \"kubernetes.io/projected/fb7a325b-7833-484f-8bba-7dc85ebf57cd-kube-api-access-b8vx6\") pod \"nova-cell0-conductor-0\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.461393 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:34 crc kubenswrapper[4753]: I0129 14:22:34.910831 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 29 14:22:35 crc kubenswrapper[4753]: I0129 14:22:35.046259 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fb7a325b-7833-484f-8bba-7dc85ebf57cd","Type":"ContainerStarted","Data":"272d86541960e9b3fd28693f32258e7157d62b325454981c2f00d03254c22cc8"}
Jan 29 14:22:35 crc kubenswrapper[4753]: I0129 14:22:35.048090 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerStarted","Data":"646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9"}
Jan 29 14:22:36 crc kubenswrapper[4753]: I0129 14:22:36.057554 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fb7a325b-7833-484f-8bba-7dc85ebf57cd","Type":"ContainerStarted","Data":"cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf"}
Jan 29 14:22:36 crc kubenswrapper[4753]: I0129 14:22:36.057916 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:36 crc kubenswrapper[4753]: I0129 14:22:36.060205 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerStarted","Data":"ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7"}
Jan 29 14:22:36 crc kubenswrapper[4753]: I0129 14:22:36.060236 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerStarted","Data":"a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53"}
Jan 29 14:22:36 crc kubenswrapper[4753]: I0129 14:22:36.079552 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.079530994 podStartE2EDuration="2.079530994s" podCreationTimestamp="2026-01-29 14:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:36.070598993 +0000 UTC m=+1190.765333375" watchObservedRunningTime="2026-01-29 14:22:36.079530994 +0000 UTC m=+1190.774265376"
Jan 29 14:22:38 crc kubenswrapper[4753]: I0129 14:22:38.079251 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerStarted","Data":"de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939"}
Jan 29 14:22:38 crc kubenswrapper[4753]: I0129 14:22:38.079890 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 29 14:22:38 crc kubenswrapper[4753]: I0129 14:22:38.114602 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.437586747 podStartE2EDuration="5.114576324s" podCreationTimestamp="2026-01-29 14:22:33 +0000 UTC" firstStartedPulling="2026-01-29 14:22:33.965517861 +0000 UTC m=+1188.660252243" lastFinishedPulling="2026-01-29 14:22:37.642507428 +0000 UTC m=+1192.337241820" observedRunningTime="2026-01-29 14:22:38.103007783 +0000 UTC m=+1192.797742255" watchObservedRunningTime="2026-01-29 14:22:38.114576324 +0000 UTC m=+1192.809310706"
Jan 29 14:22:44 crc kubenswrapper[4753]: I0129 14:22:44.511138 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.198021 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xnbzg"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.200636 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.204873 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.221723 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xnbzg"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.225568 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.284262 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnpj6\" (UniqueName: \"kubernetes.io/projected/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-kube-api-access-dnpj6\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.284685 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.284830 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-config-data\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.285001 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-scripts\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.385813 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.385875 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-config-data\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.385954 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-scripts\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.386010 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnpj6\" (UniqueName: \"kubernetes.io/projected/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-kube-api-access-dnpj6\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.392462 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-scripts\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.392680 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-config-data\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.410237 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.415405 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnpj6\" (UniqueName: \"kubernetes.io/projected/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-kube-api-access-dnpj6\") pod \"nova-cell0-cell-mapping-xnbzg\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.467319 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.468857 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.471866 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.484117 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.529786 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.532708 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.538840 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.544949 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.555018 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xnbzg"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.594707 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-config-data\") pod \"nova-scheduler-0\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.595041 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-495sm\" (UniqueName: \"kubernetes.io/projected/d591a9c5-2e54-4a18-a2e3-0facb010b535-kube-api-access-495sm\") pod \"nova-scheduler-0\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.595233 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.696950 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.698399 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.734543 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740002 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9eff08-23b2-4ee7-8b11-35ff24520929-logs\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740114 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-config-data\") pod \"nova-scheduler-0\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740235 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88kz6\" (UniqueName: \"kubernetes.io/projected/c948b692-cedf-4e4c-9876-69cf7f95d8b2-kube-api-access-88kz6\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740351 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740437 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxk2h\" (UniqueName: \"kubernetes.io/projected/3f9eff08-23b2-4ee7-8b11-35ff24520929-kube-api-access-bxk2h\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740512 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-config-data\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740635 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740715 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948b692-cedf-4e4c-9876-69cf7f95d8b2-logs\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740825 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-495sm\" (UniqueName: \"kubernetes.io/projected/d591a9c5-2e54-4a18-a2e3-0facb010b535-kube-api-access-495sm\") pod \"nova-scheduler-0\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.740953 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.741080 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-config-data\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.771518 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-config-data\") pod \"nova-scheduler-0\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.783853 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.821045 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-495sm\" (UniqueName: \"kubernetes.io/projected/d591a9c5-2e54-4a18-a2e3-0facb010b535-kube-api-access-495sm\") pod \"nova-scheduler-0\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " pod="openstack/nova-scheduler-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.821913 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.845206 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.846393 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.848926 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88kz6\" (UniqueName: \"kubernetes.io/projected/c948b692-cedf-4e4c-9876-69cf7f95d8b2-kube-api-access-88kz6\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.848985 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.849006 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxk2h\" (UniqueName: \"kubernetes.io/projected/3f9eff08-23b2-4ee7-8b11-35ff24520929-kube-api-access-bxk2h\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.849022 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-config-data\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.849075 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.849093 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948b692-cedf-4e4c-9876-69cf7f95d8b2-logs\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.849205 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-config-data\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.849233 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9eff08-23b2-4ee7-8b11-35ff24520929-logs\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.849649 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9eff08-23b2-4ee7-8b11-35ff24520929-logs\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.851635 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948b692-cedf-4e4c-9876-69cf7f95d8b2-logs\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.858474 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-config-data\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.858814 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.859858 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.868791 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.870591 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-config-data\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.898236 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.908753 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxk2h\" (UniqueName: \"kubernetes.io/projected/3f9eff08-23b2-4ee7-8b11-35ff24520929-kube-api-access-bxk2h\") pod \"nova-metadata-0\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.913942 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88kz6\" (UniqueName: \"kubernetes.io/projected/c948b692-cedf-4e4c-9876-69cf7f95d8b2-kube-api-access-88kz6\") pod \"nova-api-0\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " pod="openstack/nova-api-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.953971 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb6fj\" (UniqueName: \"kubernetes.io/projected/c49944e0-0732-4d96-9521-f0aac0d45c4a-kube-api-access-tb6fj\") pod \"nova-cell1-novncproxy-0\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.954145 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.954178 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.982168 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"]
Jan 29 14:22:45 crc kubenswrapper[4753]: I0129 14:22:45.984107 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.008220 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"]
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.055326 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.055370 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.055401 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-svc\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.055421 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.055460 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb6fj\" (UniqueName: \"kubernetes.io/projected/c49944e0-0732-4d96-9521-f0aac0d45c4a-kube-api-access-tb6fj\") pod \"nova-cell1-novncproxy-0\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.055482 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-swift-storage-0\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.055509 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.055538 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-config\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.055891 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqbm4\" (UniqueName: \"kubernetes.io/projected/593c409e-6f89-4cb9-8ab0-7910612780db-kube-api-access-kqbm4\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.063796 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.064970 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.070182 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb6fj\" (UniqueName: \"kubernetes.io/projected/c49944e0-0732-4d96-9521-f0aac0d45c4a-kube-api-access-tb6fj\") pod \"nova-cell1-novncproxy-0\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.099298 4753 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.161660 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-svc\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.161701 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.161753 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-swift-storage-0\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.161778 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.161808 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-config\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.161858 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqbm4\" (UniqueName: \"kubernetes.io/projected/593c409e-6f89-4cb9-8ab0-7910612780db-kube-api-access-kqbm4\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.162541 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-svc\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.162874 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.163350 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc 
kubenswrapper[4753]: I0129 14:22:46.163683 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-swift-storage-0\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.163895 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-config\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.166743 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.179003 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqbm4\" (UniqueName: \"kubernetes.io/projected/593c409e-6f89-4cb9-8ab0-7910612780db-kube-api-access-kqbm4\") pod \"dnsmasq-dns-5bfb54f9b5-lhrcx\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.201512 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.227098 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.322262 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xnbzg"] Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.324746 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:46 crc kubenswrapper[4753]: W0129 14:22:46.348379 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2d3ab6e_1910_48a5_bbd7_c4ec38a37571.slice/crio-eeaba57164ed03334c820990d1f0578690d47bf35b6b07727373e7fb017fd510 WatchSource:0}: Error finding container eeaba57164ed03334c820990d1f0578690d47bf35b6b07727373e7fb017fd510: Status 404 returned error can't find the container with id eeaba57164ed03334c820990d1f0578690d47bf35b6b07727373e7fb017fd510 Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.585162 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.762343 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hgtzp"] Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.764662 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.769331 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.769509 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.780076 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8ltv\" (UniqueName: \"kubernetes.io/projected/d314405b-2bd4-44b1-93f3-89d92059a50c-kube-api-access-w8ltv\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.780538 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-config-data\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.780776 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.780985 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-scripts\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.799292 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hgtzp"] Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.876855 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.886356 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.887325 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.891948 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-scripts\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.892145 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8ltv\" (UniqueName: 
\"kubernetes.io/projected/d314405b-2bd4-44b1-93f3-89d92059a50c-kube-api-access-w8ltv\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.892326 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-config-data\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.895564 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.897747 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-scripts\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.898282 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-config-data\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.907843 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8ltv\" (UniqueName: \"kubernetes.io/projected/d314405b-2bd4-44b1-93f3-89d92059a50c-kube-api-access-w8ltv\") pod \"nova-cell1-conductor-db-sync-hgtzp\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:46 crc kubenswrapper[4753]: W0129 14:22:46.944606 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f9eff08_23b2_4ee7_8b11_35ff24520929.slice/crio-07e79512415fff1e2d527dd641f99052a17e6ae1654e7a1b1c4928a3df130aed WatchSource:0}: Error finding container 07e79512415fff1e2d527dd641f99052a17e6ae1654e7a1b1c4928a3df130aed: Status 404 returned error can't find the container with id 07e79512415fff1e2d527dd641f99052a17e6ae1654e7a1b1c4928a3df130aed Jan 29 14:22:46 crc kubenswrapper[4753]: I0129 14:22:46.949557 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.065688 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"] Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.110365 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.204344 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" event={"ID":"593c409e-6f89-4cb9-8ab0-7910612780db","Type":"ContainerStarted","Data":"2a50d1f0c99bdb7f8526600576379385f9397a9272e98e1e55db5b8298016372"} Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.206387 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d591a9c5-2e54-4a18-a2e3-0facb010b535","Type":"ContainerStarted","Data":"58cb66836a5675372b2b22ef87e402ea441d7defc908b35544298af67b007bbe"} Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.209325 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xnbzg" event={"ID":"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571","Type":"ContainerStarted","Data":"7e38bce9afd9e377481b189403130a86ffaab75f07eb1b987d4aa052f07ce6b2"} Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.209381 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xnbzg" event={"ID":"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571","Type":"ContainerStarted","Data":"eeaba57164ed03334c820990d1f0578690d47bf35b6b07727373e7fb017fd510"} Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.214310 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c948b692-cedf-4e4c-9876-69cf7f95d8b2","Type":"ContainerStarted","Data":"7a9fd980235c99a902de7e17f795b7d3512053e65e1461794708ae37ff79f391"} Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.216790 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9eff08-23b2-4ee7-8b11-35ff24520929","Type":"ContainerStarted","Data":"07e79512415fff1e2d527dd641f99052a17e6ae1654e7a1b1c4928a3df130aed"} Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.217972 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c49944e0-0732-4d96-9521-f0aac0d45c4a","Type":"ContainerStarted","Data":"932baa668ebfd1e9b92264b04d1eeb70703449a10fc8b4f94208fb1b7e25698e"} Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.237466 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xnbzg" podStartSLOduration=2.237442262 podStartE2EDuration="2.237442262s" podCreationTimestamp="2026-01-29 14:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:47.226483087 +0000 UTC m=+1201.921217469" watchObservedRunningTime="2026-01-29 14:22:47.237442262 +0000 UTC m=+1201.932176644" Jan 29 14:22:47 crc kubenswrapper[4753]: I0129 14:22:47.697776 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hgtzp"] Jan 29 14:22:48 crc kubenswrapper[4753]: I0129 14:22:48.236256 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hgtzp" event={"ID":"d314405b-2bd4-44b1-93f3-89d92059a50c","Type":"ContainerStarted","Data":"e3a8991b04ac7365eb3eda08eaa04bd54d2d3c72451622541a0846ad24cc2e74"} Jan 29 14:22:48 crc kubenswrapper[4753]: I0129 14:22:48.238595 4753 generic.go:334] "Generic (PLEG): container finished" podID="593c409e-6f89-4cb9-8ab0-7910612780db" containerID="d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0" exitCode=0 
Jan 29 14:22:48 crc kubenswrapper[4753]: I0129 14:22:48.238661 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" event={"ID":"593c409e-6f89-4cb9-8ab0-7910612780db","Type":"ContainerDied","Data":"d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0"} Jan 29 14:22:49 crc kubenswrapper[4753]: I0129 14:22:49.341509 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 14:22:49 crc kubenswrapper[4753]: I0129 14:22:49.350060 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.287862 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c49944e0-0732-4d96-9521-f0aac0d45c4a","Type":"ContainerStarted","Data":"c2dd409f1ec25620fee980ea2ff1580887be01abfbc8ed8b108b40048a1db0f2"} Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.288101 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="c49944e0-0732-4d96-9521-f0aac0d45c4a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c2dd409f1ec25620fee980ea2ff1580887be01abfbc8ed8b108b40048a1db0f2" gracePeriod=30 Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.290661 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hgtzp" event={"ID":"d314405b-2bd4-44b1-93f3-89d92059a50c","Type":"ContainerStarted","Data":"ce2920a02cf6ae18e9004d13fee37b55154d0a3dda23fc0d8807c46cadda2a1c"} Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.293266 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" event={"ID":"593c409e-6f89-4cb9-8ab0-7910612780db","Type":"ContainerStarted","Data":"d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56"} Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.294010 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.295512 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d591a9c5-2e54-4a18-a2e3-0facb010b535","Type":"ContainerStarted","Data":"13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba"} Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.297447 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c948b692-cedf-4e4c-9876-69cf7f95d8b2","Type":"ContainerStarted","Data":"c844790d6a36e2fe98f2d9a31627a26c158334833c94a639c65527d06c578920"} Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.297513 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c948b692-cedf-4e4c-9876-69cf7f95d8b2","Type":"ContainerStarted","Data":"18fc42871e53aacb2ff7c4d1cf5169dc0240bb80bfef3d4f1fb19a89b1daea4e"} Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.299357 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9eff08-23b2-4ee7-8b11-35ff24520929","Type":"ContainerStarted","Data":"c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284"} Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.299391 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3f9eff08-23b2-4ee7-8b11-35ff24520929","Type":"ContainerStarted","Data":"56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b"} Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.299446 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerName="nova-metadata-log" containerID="cri-o://56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b" gracePeriod=30 Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.299470 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerName="nova-metadata-metadata" containerID="cri-o://c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284" gracePeriod=30 Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.311296 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.160751451 podStartE2EDuration="6.311271769s" podCreationTimestamp="2026-01-29 14:22:45 +0000 UTC" firstStartedPulling="2026-01-29 14:22:46.860575496 +0000 UTC m=+1201.555309878" lastFinishedPulling="2026-01-29 14:22:50.011095804 +0000 UTC m=+1204.705830196" observedRunningTime="2026-01-29 14:22:51.304740804 +0000 UTC m=+1205.999475196" watchObservedRunningTime="2026-01-29 14:22:51.311271769 +0000 UTC m=+1206.006006161" Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.340480 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.214959839 podStartE2EDuration="6.340450544s" podCreationTimestamp="2026-01-29 14:22:45 +0000 UTC" firstStartedPulling="2026-01-29 14:22:46.882900756 +0000 UTC m=+1201.577635148" lastFinishedPulling="2026-01-29 14:22:50.008391451 +0000 UTC m=+1204.703125853" observedRunningTime="2026-01-29 14:22:51.328192454 +0000 UTC m=+1206.022926836" watchObservedRunningTime="2026-01-29 14:22:51.340450544 +0000 UTC m=+1206.035184946" Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.358740 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" podStartSLOduration=6.358715845 podStartE2EDuration="6.358715845s" podCreationTimestamp="2026-01-29 14:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:51.352870798 +0000 UTC m=+1206.047605220" watchObservedRunningTime="2026-01-29 14:22:51.358715845 +0000 UTC m=+1206.053450227" Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.380258 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.315430953 podStartE2EDuration="6.380241095s" podCreationTimestamp="2026-01-29 14:22:45 +0000 UTC" firstStartedPulling="2026-01-29 14:22:46.946951309 +0000 UTC m=+1201.641685681" lastFinishedPulling="2026-01-29 14:22:50.011761441 +0000 UTC m=+1204.706495823" observedRunningTime="2026-01-29 14:22:51.377504031 +0000 UTC m=+1206.072238413" watchObservedRunningTime="2026-01-29 14:22:51.380241095 +0000 UTC m=+1206.074975467" Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.410271 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hgtzp" podStartSLOduration=5.410249431 podStartE2EDuration="5.410249431s" 
podCreationTimestamp="2026-01-29 14:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:51.391955059 +0000 UTC m=+1206.086689451" watchObservedRunningTime="2026-01-29 14:22:51.410249431 +0000 UTC m=+1206.104983813" Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.419115 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.060586878 podStartE2EDuration="6.419096539s" podCreationTimestamp="2026-01-29 14:22:45 +0000 UTC" firstStartedPulling="2026-01-29 14:22:46.606419611 +0000 UTC m=+1201.301153993" lastFinishedPulling="2026-01-29 14:22:49.964929262 +0000 UTC m=+1204.659663654" observedRunningTime="2026-01-29 14:22:51.406604203 +0000 UTC m=+1206.101338585" watchObservedRunningTime="2026-01-29 14:22:51.419096539 +0000 UTC m=+1206.113830911" Jan 29 14:22:51 crc kubenswrapper[4753]: I0129 14:22:51.908486 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.051482 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-config-data\") pod \"3f9eff08-23b2-4ee7-8b11-35ff24520929\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.051536 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxk2h\" (UniqueName: \"kubernetes.io/projected/3f9eff08-23b2-4ee7-8b11-35ff24520929-kube-api-access-bxk2h\") pod \"3f9eff08-23b2-4ee7-8b11-35ff24520929\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.051678 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-combined-ca-bundle\") pod \"3f9eff08-23b2-4ee7-8b11-35ff24520929\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.051823 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9eff08-23b2-4ee7-8b11-35ff24520929-logs\") pod \"3f9eff08-23b2-4ee7-8b11-35ff24520929\" (UID: \"3f9eff08-23b2-4ee7-8b11-35ff24520929\") " Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.054297 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f9eff08-23b2-4ee7-8b11-35ff24520929-logs" (OuterVolumeSpecName: "logs") pod "3f9eff08-23b2-4ee7-8b11-35ff24520929" (UID: "3f9eff08-23b2-4ee7-8b11-35ff24520929"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.061340 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9eff08-23b2-4ee7-8b11-35ff24520929-kube-api-access-bxk2h" (OuterVolumeSpecName: "kube-api-access-bxk2h") pod "3f9eff08-23b2-4ee7-8b11-35ff24520929" (UID: "3f9eff08-23b2-4ee7-8b11-35ff24520929"). InnerVolumeSpecName "kube-api-access-bxk2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.083257 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f9eff08-23b2-4ee7-8b11-35ff24520929" (UID: "3f9eff08-23b2-4ee7-8b11-35ff24520929"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.105379 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-config-data" (OuterVolumeSpecName: "config-data") pod "3f9eff08-23b2-4ee7-8b11-35ff24520929" (UID: "3f9eff08-23b2-4ee7-8b11-35ff24520929"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.156551 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.156596 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9eff08-23b2-4ee7-8b11-35ff24520929-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.156615 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9eff08-23b2-4ee7-8b11-35ff24520929-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.156633 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxk2h\" (UniqueName: \"kubernetes.io/projected/3f9eff08-23b2-4ee7-8b11-35ff24520929-kube-api-access-bxk2h\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.315684 4753 generic.go:334] "Generic (PLEG): container finished" podID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerID="c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284" exitCode=0 Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.316318 4753 generic.go:334] "Generic (PLEG): container finished" podID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerID="56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b" exitCode=143 Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.315868 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9eff08-23b2-4ee7-8b11-35ff24520929","Type":"ContainerDied","Data":"c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284"} Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.318070 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9eff08-23b2-4ee7-8b11-35ff24520929","Type":"ContainerDied","Data":"56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b"} Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.318090 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9eff08-23b2-4ee7-8b11-35ff24520929","Type":"ContainerDied","Data":"07e79512415fff1e2d527dd641f99052a17e6ae1654e7a1b1c4928a3df130aed"} Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.318125 4753 scope.go:117] "RemoveContainer" 
containerID="c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.315981 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.376306 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.397602 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.404484 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:22:52 crc kubenswrapper[4753]: E0129 14:22:52.405001 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerName="nova-metadata-log" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.405023 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerName="nova-metadata-log" Jan 29 14:22:52 crc kubenswrapper[4753]: E0129 14:22:52.405050 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerName="nova-metadata-metadata" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.405060 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerName="nova-metadata-metadata" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.405259 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerName="nova-metadata-metadata" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.405282 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9eff08-23b2-4ee7-8b11-35ff24520929" containerName="nova-metadata-log" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.407781 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.411467 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.413020 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.413219 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.418477 4753 scope.go:117] "RemoveContainer" containerID="56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.442874 4753 scope.go:117] "RemoveContainer" containerID="c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284" Jan 29 14:22:52 crc kubenswrapper[4753]: E0129 14:22:52.444242 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284\": container with ID starting with c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284 not found: ID does not exist" containerID="c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.444300 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284"} err="failed to get container status \"c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284\": rpc error: code = NotFound desc = could not find container \"c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284\": container with ID starting with c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284 not found: ID does not exist" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.444335 4753 scope.go:117] "RemoveContainer" containerID="56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b" Jan 29 14:22:52 crc kubenswrapper[4753]: E0129 14:22:52.446441 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b\": container with ID starting with 56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b not found: ID does not exist" containerID="56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.446488 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b"} err="failed to get container status \"56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b\": rpc error: code = NotFound desc = could not find container \"56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b\": container with ID starting with 56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b not found: ID does not exist" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.446525 4753 scope.go:117] "RemoveContainer" containerID="c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.446912 4753 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284"} err="failed to get container status \"c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284\": rpc error: code = NotFound desc = could not find container \"c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284\": container with ID starting with c8947dcc4207e46e6190822f519b07b1aa2af9a71df1239d5b1998da4c473284 not found: ID does not exist" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.446939 4753 scope.go:117] "RemoveContainer" containerID="56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.447319 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b"} err="failed to get container status \"56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b\": rpc error: code = NotFound desc = could not find container \"56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b\": container with ID starting with 56a84241bdcb113032d7035250002a67dd6c39d3e38eb88017c37184435d6b9b not found: ID does not exist" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.570199 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.570294 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.570916 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjw6s\" (UniqueName: \"kubernetes.io/projected/36a2b054-47de-46b6-b660-2166a618bf3d-kube-api-access-pjw6s\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.571712 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-config-data\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.571943 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36a2b054-47de-46b6-b660-2166a618bf3d-logs\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.674646 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjw6s\" (UniqueName: \"kubernetes.io/projected/36a2b054-47de-46b6-b660-2166a618bf3d-kube-api-access-pjw6s\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc 
kubenswrapper[4753]: I0129 14:22:52.674725 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-config-data\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.674774 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36a2b054-47de-46b6-b660-2166a618bf3d-logs\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.674848 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.674886 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.675593 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36a2b054-47de-46b6-b660-2166a618bf3d-logs\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.683594 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-config-data\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.683977 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.696934 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.723953 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjw6s\" (UniqueName: \"kubernetes.io/projected/36a2b054-47de-46b6-b660-2166a618bf3d-kube-api-access-pjw6s\") pod \"nova-metadata-0\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " pod="openstack/nova-metadata-0" Jan 29 14:22:52 crc kubenswrapper[4753]: I0129 14:22:52.739067 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:22:53 crc kubenswrapper[4753]: I0129 14:22:53.251245 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:22:53 crc kubenswrapper[4753]: W0129 14:22:53.270054 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36a2b054_47de_46b6_b660_2166a618bf3d.slice/crio-856647db80b8f4b2438f9d5334e97e934eee35979993ac35cf9bfdf720e0a45a WatchSource:0}: Error finding container 856647db80b8f4b2438f9d5334e97e934eee35979993ac35cf9bfdf720e0a45a: Status 404 returned error can't find the container with id 856647db80b8f4b2438f9d5334e97e934eee35979993ac35cf9bfdf720e0a45a Jan 29 14:22:53 crc kubenswrapper[4753]: I0129 14:22:53.333471 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36a2b054-47de-46b6-b660-2166a618bf3d","Type":"ContainerStarted","Data":"856647db80b8f4b2438f9d5334e97e934eee35979993ac35cf9bfdf720e0a45a"} Jan 29 14:22:54 crc kubenswrapper[4753]: I0129 14:22:54.166027 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9eff08-23b2-4ee7-8b11-35ff24520929" path="/var/lib/kubelet/pods/3f9eff08-23b2-4ee7-8b11-35ff24520929/volumes" Jan 29 14:22:54 crc kubenswrapper[4753]: I0129 14:22:54.347763 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36a2b054-47de-46b6-b660-2166a618bf3d","Type":"ContainerStarted","Data":"337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906"} Jan 29 14:22:54 crc kubenswrapper[4753]: I0129 14:22:54.348871 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36a2b054-47de-46b6-b660-2166a618bf3d","Type":"ContainerStarted","Data":"0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001"} Jan 29 14:22:54 crc kubenswrapper[4753]: I0129 14:22:54.381586 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.3815584100000002 podStartE2EDuration="2.38155841s" podCreationTimestamp="2026-01-29 14:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:22:54.373830923 +0000 UTC m=+1209.068565345" watchObservedRunningTime="2026-01-29 14:22:54.38155841 +0000 UTC m=+1209.076292802" Jan 29 14:22:55 crc kubenswrapper[4753]: I0129 14:22:55.360100 4753 generic.go:334] "Generic (PLEG): container finished" podID="b2d3ab6e-1910-48a5-bbd7-c4ec38a37571" containerID="7e38bce9afd9e377481b189403130a86ffaab75f07eb1b987d4aa052f07ce6b2" exitCode=0 Jan 29 14:22:55 crc kubenswrapper[4753]: I0129 14:22:55.360273 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xnbzg" event={"ID":"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571","Type":"ContainerDied","Data":"7e38bce9afd9e377481b189403130a86ffaab75f07eb1b987d4aa052f07ce6b2"} Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.100047 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.100499 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.140948 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 
14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.202337 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.202385 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.227412 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.326365 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.388403 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-j42ph"] Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.389250 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" podUID="dbf11a80-f998-4c27-8534-c6634ef15703" containerName="dnsmasq-dns" containerID="cri-o://0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864" gracePeriod=10 Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.454251 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.952458 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xnbzg" Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.990134 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-scripts\") pod \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.990215 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-config-data\") pod \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.990266 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnpj6\" (UniqueName: \"kubernetes.io/projected/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-kube-api-access-dnpj6\") pod \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " Jan 29 14:22:56 crc kubenswrapper[4753]: I0129 14:22:56.990345 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-combined-ca-bundle\") pod \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\" (UID: \"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571\") " Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.007118 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-kube-api-access-dnpj6" (OuterVolumeSpecName: "kube-api-access-dnpj6") pod "b2d3ab6e-1910-48a5-bbd7-c4ec38a37571" (UID: "b2d3ab6e-1910-48a5-bbd7-c4ec38a37571"). InnerVolumeSpecName "kube-api-access-dnpj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.011315 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-scripts" (OuterVolumeSpecName: "scripts") pod "b2d3ab6e-1910-48a5-bbd7-c4ec38a37571" (UID: "b2d3ab6e-1910-48a5-bbd7-c4ec38a37571"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.091698 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.091730 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnpj6\" (UniqueName: \"kubernetes.io/projected/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-kube-api-access-dnpj6\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.099305 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-config-data" (OuterVolumeSpecName: "config-data") pod "b2d3ab6e-1910-48a5-bbd7-c4ec38a37571" (UID: "b2d3ab6e-1910-48a5-bbd7-c4ec38a37571"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.118087 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2d3ab6e-1910-48a5-bbd7-c4ec38a37571" (UID: "b2d3ab6e-1910-48a5-bbd7-c4ec38a37571"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.137295 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.219319 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.220979 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.286353 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.286365 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.323755 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-config\") pod \"dbf11a80-f998-4c27-8534-c6634ef15703\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.323925 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdbhh\" (UniqueName: \"kubernetes.io/projected/dbf11a80-f998-4c27-8534-c6634ef15703-kube-api-access-jdbhh\") pod \"dbf11a80-f998-4c27-8534-c6634ef15703\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.324036 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-nb\") pod \"dbf11a80-f998-4c27-8534-c6634ef15703\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.324732 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-svc\") pod \"dbf11a80-f998-4c27-8534-c6634ef15703\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.325655 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-sb\") pod \"dbf11a80-f998-4c27-8534-c6634ef15703\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.326134 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-swift-storage-0\") pod \"dbf11a80-f998-4c27-8534-c6634ef15703\" (UID: \"dbf11a80-f998-4c27-8534-c6634ef15703\") " Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.328990 4753 operation_generator.go:803] 
Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.328990 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf11a80-f998-4c27-8534-c6634ef15703-kube-api-access-jdbhh" (OuterVolumeSpecName: "kube-api-access-jdbhh") pod "dbf11a80-f998-4c27-8534-c6634ef15703" (UID: "dbf11a80-f998-4c27-8534-c6634ef15703"). InnerVolumeSpecName "kube-api-access-jdbhh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.339872 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdbhh\" (UniqueName: \"kubernetes.io/projected/dbf11a80-f998-4c27-8534-c6634ef15703-kube-api-access-jdbhh\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.372900 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dbf11a80-f998-4c27-8534-c6634ef15703" (UID: "dbf11a80-f998-4c27-8534-c6634ef15703"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.383215 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dbf11a80-f998-4c27-8534-c6634ef15703" (UID: "dbf11a80-f998-4c27-8534-c6634ef15703"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.388041 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-config" (OuterVolumeSpecName: "config") pod "dbf11a80-f998-4c27-8534-c6634ef15703" (UID: "dbf11a80-f998-4c27-8534-c6634ef15703"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.392009 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dbf11a80-f998-4c27-8534-c6634ef15703" (UID: "dbf11a80-f998-4c27-8534-c6634ef15703"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.392989 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dbf11a80-f998-4c27-8534-c6634ef15703" (UID: "dbf11a80-f998-4c27-8534-c6634ef15703"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.398516 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xnbzg" event={"ID":"b2d3ab6e-1910-48a5-bbd7-c4ec38a37571","Type":"ContainerDied","Data":"eeaba57164ed03334c820990d1f0578690d47bf35b6b07727373e7fb017fd510"}
Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.398573 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeaba57164ed03334c820990d1f0578690d47bf35b6b07727373e7fb017fd510"
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xnbzg" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.411462 4753 generic.go:334] "Generic (PLEG): container finished" podID="dbf11a80-f998-4c27-8534-c6634ef15703" containerID="0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864" exitCode=0 Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.411630 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" event={"ID":"dbf11a80-f998-4c27-8534-c6634ef15703","Type":"ContainerDied","Data":"0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864"} Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.411665 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" event={"ID":"dbf11a80-f998-4c27-8534-c6634ef15703","Type":"ContainerDied","Data":"bb4b8c9c6714ae832c15e65edc1dc6e985937e4cce85dde6168b7fa51ce8bdc8"} Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.411686 4753 scope.go:117] "RemoveContainer" containerID="0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.411758 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.442277 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.442617 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.442627 4753 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.442639 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.442647 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbf11a80-f998-4c27-8534-c6634ef15703-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.472711 4753 scope.go:117] "RemoveContainer" containerID="f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.475318 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-j42ph"] Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.484948 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b4f5fc4f-j42ph"] Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.497368 4753 scope.go:117] "RemoveContainer" containerID="0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864" Jan 29 14:22:57 crc kubenswrapper[4753]: E0129 14:22:57.497875 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864\": container with ID starting with 0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864 not found: ID does not exist" containerID="0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.497930 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864"} err="failed to get container status \"0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864\": rpc error: code = NotFound desc = could not find container \"0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864\": container with ID starting with 0a996bb769a00e7cbd4fb03b051fe90bde9cd00b0952deed575f4c04bdfc3864 not found: ID does not exist" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.497964 4753 scope.go:117] "RemoveContainer" containerID="f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7" Jan 29 14:22:57 crc kubenswrapper[4753]: E0129 14:22:57.499303 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7\": container with ID starting with f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7 not found: ID does not exist" containerID="f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.499358 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7"} err="failed to get container status \"f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7\": rpc error: code = NotFound desc = could not find container \"f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7\": container with ID starting with f28defe7945c7f42934f571d02ead0f885b8d22f5473fe380b1c97a96f007ac7 not found: ID does not exist" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.632266 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.632702 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-log" containerID="cri-o://18fc42871e53aacb2ff7c4d1cf5169dc0240bb80bfef3d4f1fb19a89b1daea4e" gracePeriod=30 Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.632824 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-api" containerID="cri-o://c844790d6a36e2fe98f2d9a31627a26c158334833c94a639c65527d06c578920" gracePeriod=30 Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.650316 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.659192 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.659422 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="36a2b054-47de-46b6-b660-2166a618bf3d" containerName="nova-metadata-log" 
containerID="cri-o://0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001" gracePeriod=30 Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.659568 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="36a2b054-47de-46b6-b660-2166a618bf3d" containerName="nova-metadata-metadata" containerID="cri-o://337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906" gracePeriod=30 Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.739518 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 14:22:57 crc kubenswrapper[4753]: I0129 14:22:57.739569 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.165537 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf11a80-f998-4c27-8534-c6634ef15703" path="/var/lib/kubelet/pods/dbf11a80-f998-4c27-8534-c6634ef15703/volumes" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.344792 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.366927 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-config-data\") pod \"36a2b054-47de-46b6-b660-2166a618bf3d\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.367054 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-combined-ca-bundle\") pod \"36a2b054-47de-46b6-b660-2166a618bf3d\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.367103 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36a2b054-47de-46b6-b660-2166a618bf3d-logs\") pod \"36a2b054-47de-46b6-b660-2166a618bf3d\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.367221 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjw6s\" (UniqueName: \"kubernetes.io/projected/36a2b054-47de-46b6-b660-2166a618bf3d-kube-api-access-pjw6s\") pod \"36a2b054-47de-46b6-b660-2166a618bf3d\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.367322 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-nova-metadata-tls-certs\") pod \"36a2b054-47de-46b6-b660-2166a618bf3d\" (UID: \"36a2b054-47de-46b6-b660-2166a618bf3d\") " Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.367910 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a2b054-47de-46b6-b660-2166a618bf3d-logs" (OuterVolumeSpecName: "logs") pod "36a2b054-47de-46b6-b660-2166a618bf3d" (UID: "36a2b054-47de-46b6-b660-2166a618bf3d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.377334 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a2b054-47de-46b6-b660-2166a618bf3d-kube-api-access-pjw6s" (OuterVolumeSpecName: "kube-api-access-pjw6s") pod "36a2b054-47de-46b6-b660-2166a618bf3d" (UID: "36a2b054-47de-46b6-b660-2166a618bf3d"). InnerVolumeSpecName "kube-api-access-pjw6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.397091 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-config-data" (OuterVolumeSpecName: "config-data") pod "36a2b054-47de-46b6-b660-2166a618bf3d" (UID: "36a2b054-47de-46b6-b660-2166a618bf3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.422910 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36a2b054-47de-46b6-b660-2166a618bf3d" (UID: "36a2b054-47de-46b6-b660-2166a618bf3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.428784 4753 generic.go:334] "Generic (PLEG): container finished" podID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerID="18fc42871e53aacb2ff7c4d1cf5169dc0240bb80bfef3d4f1fb19a89b1daea4e" exitCode=143 Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.428938 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c948b692-cedf-4e4c-9876-69cf7f95d8b2","Type":"ContainerDied","Data":"18fc42871e53aacb2ff7c4d1cf5169dc0240bb80bfef3d4f1fb19a89b1daea4e"} Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.431992 4753 generic.go:334] "Generic (PLEG): container finished" podID="36a2b054-47de-46b6-b660-2166a618bf3d" containerID="337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906" exitCode=0 Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.432114 4753 generic.go:334] "Generic (PLEG): container finished" podID="36a2b054-47de-46b6-b660-2166a618bf3d" containerID="0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001" exitCode=143 Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.432305 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36a2b054-47de-46b6-b660-2166a618bf3d","Type":"ContainerDied","Data":"337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906"} Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.432418 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36a2b054-47de-46b6-b660-2166a618bf3d","Type":"ContainerDied","Data":"0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001"} Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.432499 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36a2b054-47de-46b6-b660-2166a618bf3d","Type":"ContainerDied","Data":"856647db80b8f4b2438f9d5334e97e934eee35979993ac35cf9bfdf720e0a45a"} Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.432572 4753 scope.go:117] "RemoveContainer" containerID="337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906" Jan 29 14:22:58 
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.432769 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.437812 4753 generic.go:334] "Generic (PLEG): container finished" podID="d314405b-2bd4-44b1-93f3-89d92059a50c" containerID="ce2920a02cf6ae18e9004d13fee37b55154d0a3dda23fc0d8807c46cadda2a1c" exitCode=0
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.438272 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hgtzp" event={"ID":"d314405b-2bd4-44b1-93f3-89d92059a50c","Type":"ContainerDied","Data":"ce2920a02cf6ae18e9004d13fee37b55154d0a3dda23fc0d8807c46cadda2a1c"}
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.441648 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d591a9c5-2e54-4a18-a2e3-0facb010b535" containerName="nova-scheduler-scheduler" containerID="cri-o://13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba" gracePeriod=30
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.454469 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "36a2b054-47de-46b6-b660-2166a618bf3d" (UID: "36a2b054-47de-46b6-b660-2166a618bf3d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.469230 4753 scope.go:117] "RemoveContainer" containerID="0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.469709 4753 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.469746 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.469759 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a2b054-47de-46b6-b660-2166a618bf3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.469770 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36a2b054-47de-46b6-b660-2166a618bf3d-logs\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.469779 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjw6s\" (UniqueName: \"kubernetes.io/projected/36a2b054-47de-46b6-b660-2166a618bf3d-kube-api-access-pjw6s\") on node \"crc\" DevicePath \"\""
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.491447 4753 scope.go:117] "RemoveContainer" containerID="337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906"
Jan 29 14:22:58 crc kubenswrapper[4753]: E0129 14:22:58.492046 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906\": container with ID starting with 337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906 not found: ID does not exist" containerID="337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.492117 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906"} err="failed to get container status \"337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906\": rpc error: code = NotFound desc = could not find container \"337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906\": container with ID starting with 337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906 not found: ID does not exist"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.492247 4753 scope.go:117] "RemoveContainer" containerID="0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001"
Jan 29 14:22:58 crc kubenswrapper[4753]: E0129 14:22:58.492532 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001\": container with ID starting with 0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001 not found: ID does not exist" containerID="0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.492565 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001"} err="failed to get container status \"0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001\": rpc error: code = NotFound desc = could not find container \"0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001\": container with ID starting with 0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001 not found: ID does not exist"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.492584 4753 scope.go:117] "RemoveContainer" containerID="337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.492838 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906"} err="failed to get container status \"337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906\": rpc error: code = NotFound desc = could not find container \"337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906\": container with ID starting with 337a4064fe25a02176178e097cfcb5ddadec27a5aa32eab15c98ba57ce39b906 not found: ID does not exist"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.492869 4753 scope.go:117] "RemoveContainer" containerID="0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.493312 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001"} err="failed to get container status \"0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001\": rpc error: code = NotFound desc = could not find container \"0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001\": container with ID starting with 0f086577218fbaa058ac7c7c032860b433e475b7b6a5314fabda28d66ebf9001 not found: ID does not exist"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.784687 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.796274 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.810555 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 29 14:22:58 crc kubenswrapper[4753]: E0129 14:22:58.811063 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf11a80-f998-4c27-8534-c6634ef15703" containerName="dnsmasq-dns"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.811081 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf11a80-f998-4c27-8534-c6634ef15703" containerName="dnsmasq-dns"
Jan 29 14:22:58 crc kubenswrapper[4753]: E0129 14:22:58.811099 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a2b054-47de-46b6-b660-2166a618bf3d" containerName="nova-metadata-metadata"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.811105 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a2b054-47de-46b6-b660-2166a618bf3d" containerName="nova-metadata-metadata"
Jan 29 14:22:58 crc kubenswrapper[4753]: E0129 14:22:58.811122 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2d3ab6e-1910-48a5-bbd7-c4ec38a37571" containerName="nova-manage"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.811129 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2d3ab6e-1910-48a5-bbd7-c4ec38a37571" containerName="nova-manage"
Jan 29 14:22:58 crc kubenswrapper[4753]: E0129 14:22:58.811144 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf11a80-f998-4c27-8534-c6634ef15703" containerName="init"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.811162 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf11a80-f998-4c27-8534-c6634ef15703" containerName="init"
Jan 29 14:22:58 crc kubenswrapper[4753]: E0129 14:22:58.811180 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a2b054-47de-46b6-b660-2166a618bf3d" containerName="nova-metadata-log"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.811185 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a2b054-47de-46b6-b660-2166a618bf3d" containerName="nova-metadata-log"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.811366 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a2b054-47de-46b6-b660-2166a618bf3d" containerName="nova-metadata-log"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.811376 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a2b054-47de-46b6-b660-2166a618bf3d" containerName="nova-metadata-metadata"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.811389 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2d3ab6e-1910-48a5-bbd7-c4ec38a37571" containerName="nova-manage"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.811406 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf11a80-f998-4c27-8534-c6634ef15703" containerName="dnsmasq-dns"
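
The cpu_manager/state_mem/memory_manager burst above happens while the replacement nova-metadata-0 is admitted: before accepting the new pod, the resource managers purge per-container assignments belonging to pods that no longer exist. A conceptual sketch of that sweep (invented names, not kubelet source):

    package main

    import "fmt"

    type containerKey struct{ podUID, name string }

    // removeStaleState drops assignments whose pod is no longer active,
    // mirroring the "RemoveStaleState: removing container" lines above.
    func removeStaleState(assignments map[containerKey][]int, active map[string]bool) {
        for k := range assignments {
            if !active[k.podUID] {
                fmt.Printf("RemoveStaleState: removing container %q of pod %q\n", k.name, k.podUID)
                delete(assignments, k) // deleting while ranging is safe in Go
            }
        }
    }

    func main() {
        a := map[containerKey][]int{{"dbf11a80-f998-4c27-8534-c6634ef15703", "dnsmasq-dns"}: {2, 3}}
        removeStaleState(a, map[string]bool{}) // no pods active -> entry removed
    }
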
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.815726 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.815909 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.829654 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.881686 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.881769 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-config-data\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.881805 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spv8n\" (UniqueName: \"kubernetes.io/projected/7f9b383a-32b0-4302-96ae-4bcd900cd383-kube-api-access-spv8n\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.881838 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9b383a-32b0-4302-96ae-4bcd900cd383-logs\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.881911 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.984304 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-config-data\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.984716 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spv8n\" (UniqueName: \"kubernetes.io/projected/7f9b383a-32b0-4302-96ae-4bcd900cd383-kube-api-access-spv8n\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0" Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.984944 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9b383a-32b0-4302-96ae-4bcd900cd383-logs\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0" Jan 29 
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.985222 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.985553 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.985564 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9b383a-32b0-4302-96ae-4bcd900cd383-logs\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.991062 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:58 crc kubenswrapper[4753]: I0129 14:22:58.991929 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-config-data\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:59 crc kubenswrapper[4753]: I0129 14:22:59.000367 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:59 crc kubenswrapper[4753]: I0129 14:22:59.026285 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spv8n\" (UniqueName: \"kubernetes.io/projected/7f9b383a-32b0-4302-96ae-4bcd900cd383-kube-api-access-spv8n\") pod \"nova-metadata-0\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " pod="openstack/nova-metadata-0"
Jan 29 14:22:59 crc kubenswrapper[4753]: I0129 14:22:59.127130 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 29 14:22:59 crc kubenswrapper[4753]: I0129 14:22:59.638782 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.007905 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-config-data\") pod \"d314405b-2bd4-44b1-93f3-89d92059a50c\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.008003 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8ltv\" (UniqueName: \"kubernetes.io/projected/d314405b-2bd4-44b1-93f3-89d92059a50c-kube-api-access-w8ltv\") pod \"d314405b-2bd4-44b1-93f3-89d92059a50c\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.008192 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-combined-ca-bundle\") pod \"d314405b-2bd4-44b1-93f3-89d92059a50c\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.008228 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-scripts\") pod \"d314405b-2bd4-44b1-93f3-89d92059a50c\" (UID: \"d314405b-2bd4-44b1-93f3-89d92059a50c\") " Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.014700 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d314405b-2bd4-44b1-93f3-89d92059a50c-kube-api-access-w8ltv" (OuterVolumeSpecName: "kube-api-access-w8ltv") pod "d314405b-2bd4-44b1-93f3-89d92059a50c" (UID: "d314405b-2bd4-44b1-93f3-89d92059a50c"). InnerVolumeSpecName "kube-api-access-w8ltv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.019071 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-scripts" (OuterVolumeSpecName: "scripts") pod "d314405b-2bd4-44b1-93f3-89d92059a50c" (UID: "d314405b-2bd4-44b1-93f3-89d92059a50c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.039525 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d314405b-2bd4-44b1-93f3-89d92059a50c" (UID: "d314405b-2bd4-44b1-93f3-89d92059a50c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.039979 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-config-data" (OuterVolumeSpecName: "config-data") pod "d314405b-2bd4-44b1-93f3-89d92059a50c" (UID: "d314405b-2bd4-44b1-93f3-89d92059a50c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.110903 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.111345 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.111358 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d314405b-2bd4-44b1-93f3-89d92059a50c-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.111373 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8ltv\" (UniqueName: \"kubernetes.io/projected/d314405b-2bd4-44b1-93f3-89d92059a50c-kube-api-access-w8ltv\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.163378 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a2b054-47de-46b6-b660-2166a618bf3d" path="/var/lib/kubelet/pods/36a2b054-47de-46b6-b660-2166a618bf3d/volumes" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.461400 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f9b383a-32b0-4302-96ae-4bcd900cd383","Type":"ContainerStarted","Data":"308efde5a1b73816a7b74039efd8e127ce5aafd5f6d5ee3906227a9fcfccf2b8"} Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.461503 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f9b383a-32b0-4302-96ae-4bcd900cd383","Type":"ContainerStarted","Data":"06f2a776446be8e207b874f0c09730d6dd63037c338c1c964cfe4ac94fa1b220"} Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.461534 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f9b383a-32b0-4302-96ae-4bcd900cd383","Type":"ContainerStarted","Data":"7ade97f6a50978c412d673c038a2ea1a096aec1c8a7e3af70a77c8dd43c962b0"} Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.463451 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hgtzp" event={"ID":"d314405b-2bd4-44b1-93f3-89d92059a50c","Type":"ContainerDied","Data":"e3a8991b04ac7365eb3eda08eaa04bd54d2d3c72451622541a0846ad24cc2e74"} Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.463495 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hgtzp" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.463504 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3a8991b04ac7365eb3eda08eaa04bd54d2d3c72451622541a0846ad24cc2e74" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.499191 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.499162225 podStartE2EDuration="2.499162225s" podCreationTimestamp="2026-01-29 14:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:00.488462218 +0000 UTC m=+1215.183196640" watchObservedRunningTime="2026-01-29 14:23:00.499162225 +0000 UTC m=+1215.193896627" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.557672 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 14:23:00 crc kubenswrapper[4753]: E0129 14:23:00.558250 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d314405b-2bd4-44b1-93f3-89d92059a50c" containerName="nova-cell1-conductor-db-sync" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.558276 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d314405b-2bd4-44b1-93f3-89d92059a50c" containerName="nova-cell1-conductor-db-sync" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.558494 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d314405b-2bd4-44b1-93f3-89d92059a50c" containerName="nova-cell1-conductor-db-sync" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.559404 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.562288 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.570250 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.625071 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.625204 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.625247 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxrm9\" (UniqueName: \"kubernetes.io/projected/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-kube-api-access-pxrm9\") pod \"nova-cell1-conductor-0\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.726975 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.727098 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxrm9\" (UniqueName: \"kubernetes.io/projected/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-kube-api-access-pxrm9\") pod \"nova-cell1-conductor-0\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.727321 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.736783 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.737288 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.746118 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxrm9\" (UniqueName: \"kubernetes.io/projected/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-kube-api-access-pxrm9\") pod \"nova-cell1-conductor-0\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:00 crc kubenswrapper[4753]: I0129 14:23:00.888784 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:01 crc kubenswrapper[4753]: E0129 14:23:01.104135 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 14:23:01 crc kubenswrapper[4753]: E0129 14:23:01.106319 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 14:23:01 crc kubenswrapper[4753]: E0129 14:23:01.108608 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 14:23:01 crc kubenswrapper[4753]: E0129 14:23:01.108787 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d591a9c5-2e54-4a18-a2e3-0facb010b535" containerName="nova-scheduler-scheduler" Jan 29 14:23:01 crc kubenswrapper[4753]: I0129 14:23:01.408106 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 14:23:01 crc kubenswrapper[4753]: W0129 14:23:01.417004 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf44a6dc_a0bc_487f_b82a_05e0d08aa7ea.slice/crio-1552ae38dfaf11b830c740e2f045634d853aca7e5898bddb3981b56216ade5a1 WatchSource:0}: Error finding container 1552ae38dfaf11b830c740e2f045634d853aca7e5898bddb3981b56216ade5a1: Status 404 returned error can't find the container with id 1552ae38dfaf11b830c740e2f045634d853aca7e5898bddb3981b56216ade5a1 Jan 29 14:23:01 crc kubenswrapper[4753]: I0129 14:23:01.475724 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea","Type":"ContainerStarted","Data":"1552ae38dfaf11b830c740e2f045634d853aca7e5898bddb3981b56216ade5a1"} Jan 29 14:23:01 crc kubenswrapper[4753]: I0129 14:23:01.737793 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b4f5fc4f-j42ph" podUID="dbf11a80-f998-4c27-8534-c6634ef15703" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.164:5353: i/o timeout" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.467988 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.472882 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-combined-ca-bundle\") pod \"d591a9c5-2e54-4a18-a2e3-0facb010b535\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.473013 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-config-data\") pod \"d591a9c5-2e54-4a18-a2e3-0facb010b535\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.473138 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-495sm\" (UniqueName: \"kubernetes.io/projected/d591a9c5-2e54-4a18-a2e3-0facb010b535-kube-api-access-495sm\") pod \"d591a9c5-2e54-4a18-a2e3-0facb010b535\" (UID: \"d591a9c5-2e54-4a18-a2e3-0facb010b535\") " Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.503592 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d591a9c5-2e54-4a18-a2e3-0facb010b535-kube-api-access-495sm" (OuterVolumeSpecName: "kube-api-access-495sm") pod "d591a9c5-2e54-4a18-a2e3-0facb010b535" (UID: "d591a9c5-2e54-4a18-a2e3-0facb010b535"). InnerVolumeSpecName "kube-api-access-495sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.526412 4753 generic.go:334] "Generic (PLEG): container finished" podID="d591a9c5-2e54-4a18-a2e3-0facb010b535" containerID="13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba" exitCode=0 Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.526505 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.526522 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d591a9c5-2e54-4a18-a2e3-0facb010b535","Type":"ContainerDied","Data":"13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba"} Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.526769 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d591a9c5-2e54-4a18-a2e3-0facb010b535","Type":"ContainerDied","Data":"58cb66836a5675372b2b22ef87e402ea441d7defc908b35544298af67b007bbe"} Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.526818 4753 scope.go:117] "RemoveContainer" containerID="13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.531354 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea","Type":"ContainerStarted","Data":"1b5479e0d4430d6fc3745e3ed9afa5a4334d10df67d89187482d0455266b8f05"} Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.531645 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.531779 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-config-data" (OuterVolumeSpecName: "config-data") pod "d591a9c5-2e54-4a18-a2e3-0facb010b535" (UID: "d591a9c5-2e54-4a18-a2e3-0facb010b535"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.550769 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d591a9c5-2e54-4a18-a2e3-0facb010b535" (UID: "d591a9c5-2e54-4a18-a2e3-0facb010b535"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.577953 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-495sm\" (UniqueName: \"kubernetes.io/projected/d591a9c5-2e54-4a18-a2e3-0facb010b535-kube-api-access-495sm\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.577992 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.578006 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d591a9c5-2e54-4a18-a2e3-0facb010b535-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.587021 4753 scope.go:117] "RemoveContainer" containerID="13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba" Jan 29 14:23:02 crc kubenswrapper[4753]: E0129 14:23:02.588306 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba\": container with ID starting with 13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba not found: ID does not exist" containerID="13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.588420 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba"} err="failed to get container status \"13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba\": rpc error: code = NotFound desc = could not find container \"13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba\": container with ID starting with 13e5d544e013224734c99bf79120a2ccb1fb03f69eb80d57839ca51f6bdc45ba not found: ID does not exist" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.864992 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.8649624 podStartE2EDuration="2.8649624s" podCreationTimestamp="2026-01-29 14:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:02.550656448 +0000 UTC m=+1217.245390840" watchObservedRunningTime="2026-01-29 14:23:02.8649624 +0000 UTC m=+1217.559696802" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.876048 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.904395 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.922966 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:02 crc kubenswrapper[4753]: E0129 14:23:02.923541 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d591a9c5-2e54-4a18-a2e3-0facb010b535" containerName="nova-scheduler-scheduler" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.923567 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d591a9c5-2e54-4a18-a2e3-0facb010b535" containerName="nova-scheduler-scheduler" Jan 29 14:23:02 crc 
kubenswrapper[4753]: I0129 14:23:02.923868 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d591a9c5-2e54-4a18-a2e3-0facb010b535" containerName="nova-scheduler-scheduler" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.924724 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.927416 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.935923 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.986974 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-config-data\") pod \"nova-scheduler-0\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.987056 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:02 crc kubenswrapper[4753]: I0129 14:23:02.987127 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4ts6\" (UniqueName: \"kubernetes.io/projected/1fff4213-2daa-4a3d-9802-ee2f39232e15-kube-api-access-w4ts6\") pod \"nova-scheduler-0\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.089221 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.089757 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ts6\" (UniqueName: \"kubernetes.io/projected/1fff4213-2daa-4a3d-9802-ee2f39232e15-kube-api-access-w4ts6\") pod \"nova-scheduler-0\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.090307 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-config-data\") pod \"nova-scheduler-0\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.097144 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-config-data\") pod \"nova-scheduler-0\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.100967 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.121250 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ts6\" (UniqueName: \"kubernetes.io/projected/1fff4213-2daa-4a3d-9802-ee2f39232e15-kube-api-access-w4ts6\") pod \"nova-scheduler-0\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.253984 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.511333 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.553108 4753 generic.go:334] "Generic (PLEG): container finished" podID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerID="c844790d6a36e2fe98f2d9a31627a26c158334833c94a639c65527d06c578920" exitCode=0 Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.553197 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c948b692-cedf-4e4c-9876-69cf7f95d8b2","Type":"ContainerDied","Data":"c844790d6a36e2fe98f2d9a31627a26c158334833c94a639c65527d06c578920"} Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.572678 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.714084 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948b692-cedf-4e4c-9876-69cf7f95d8b2-logs\") pod \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.714616 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-config-data\") pod \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.714497 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c948b692-cedf-4e4c-9876-69cf7f95d8b2-logs" (OuterVolumeSpecName: "logs") pod "c948b692-cedf-4e4c-9876-69cf7f95d8b2" (UID: "c948b692-cedf-4e4c-9876-69cf7f95d8b2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.714708 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88kz6\" (UniqueName: \"kubernetes.io/projected/c948b692-cedf-4e4c-9876-69cf7f95d8b2-kube-api-access-88kz6\") pod \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.714807 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-combined-ca-bundle\") pod \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\" (UID: \"c948b692-cedf-4e4c-9876-69cf7f95d8b2\") " Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.715204 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c948b692-cedf-4e4c-9876-69cf7f95d8b2-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.721762 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c948b692-cedf-4e4c-9876-69cf7f95d8b2-kube-api-access-88kz6" (OuterVolumeSpecName: "kube-api-access-88kz6") pod "c948b692-cedf-4e4c-9876-69cf7f95d8b2" (UID: "c948b692-cedf-4e4c-9876-69cf7f95d8b2"). InnerVolumeSpecName "kube-api-access-88kz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.755284 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-config-data" (OuterVolumeSpecName: "config-data") pod "c948b692-cedf-4e4c-9876-69cf7f95d8b2" (UID: "c948b692-cedf-4e4c-9876-69cf7f95d8b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.760027 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c948b692-cedf-4e4c-9876-69cf7f95d8b2" (UID: "c948b692-cedf-4e4c-9876-69cf7f95d8b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.772384 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.816381 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.816500 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88kz6\" (UniqueName: \"kubernetes.io/projected/c948b692-cedf-4e4c-9876-69cf7f95d8b2-kube-api-access-88kz6\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:03 crc kubenswrapper[4753]: I0129 14:23:03.816558 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c948b692-cedf-4e4c-9876-69cf7f95d8b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.127666 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.130427 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.175223 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d591a9c5-2e54-4a18-a2e3-0facb010b535" path="/var/lib/kubelet/pods/d591a9c5-2e54-4a18-a2e3-0facb010b535/volumes" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.589416 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fff4213-2daa-4a3d-9802-ee2f39232e15","Type":"ContainerStarted","Data":"d21f9f454fbad56dc31ecf236885f4f7d85e3d99cbd2b1aab2f6aca36d593874"} Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.589531 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fff4213-2daa-4a3d-9802-ee2f39232e15","Type":"ContainerStarted","Data":"6edc214cbc4c81d13d9353a233b801d7de40b8232d1f179121b79e96bf952152"} Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.595972 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.597049 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c948b692-cedf-4e4c-9876-69cf7f95d8b2","Type":"ContainerDied","Data":"7a9fd980235c99a902de7e17f795b7d3512053e65e1461794708ae37ff79f391"} Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.597107 4753 scope.go:117] "RemoveContainer" containerID="c844790d6a36e2fe98f2d9a31627a26c158334833c94a639c65527d06c578920" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.628277 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.62819937 podStartE2EDuration="2.62819937s" podCreationTimestamp="2026-01-29 14:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:04.61855433 +0000 UTC m=+1219.313288742" watchObservedRunningTime="2026-01-29 14:23:04.62819937 +0000 UTC m=+1219.322933762" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.674188 4753 scope.go:117] "RemoveContainer" containerID="18fc42871e53aacb2ff7c4d1cf5169dc0240bb80bfef3d4f1fb19a89b1daea4e" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.699856 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.716814 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.730460 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:04 crc kubenswrapper[4753]: E0129 14:23:04.731569 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-api" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.731588 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-api" Jan 29 14:23:04 crc kubenswrapper[4753]: E0129 14:23:04.731622 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-log" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.731628 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-log" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.731827 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-api" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.731853 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" containerName="nova-api-log" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.732984 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.735956 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.741542 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.845616 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-config-data\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.845730 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.845968 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc50a465-1831-48ea-aeda-d9861e5e2007-logs\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.846319 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77c8r\" (UniqueName: \"kubernetes.io/projected/dc50a465-1831-48ea-aeda-d9861e5e2007-kube-api-access-77c8r\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.947669 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-config-data\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.947746 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.947800 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc50a465-1831-48ea-aeda-d9861e5e2007-logs\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.947909 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77c8r\" (UniqueName: \"kubernetes.io/projected/dc50a465-1831-48ea-aeda-d9861e5e2007-kube-api-access-77c8r\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.950270 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc50a465-1831-48ea-aeda-d9861e5e2007-logs\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " 
pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.956363 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-config-data\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.957892 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:04 crc kubenswrapper[4753]: I0129 14:23:04.978759 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77c8r\" (UniqueName: \"kubernetes.io/projected/dc50a465-1831-48ea-aeda-d9861e5e2007-kube-api-access-77c8r\") pod \"nova-api-0\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " pod="openstack/nova-api-0" Jan 29 14:23:05 crc kubenswrapper[4753]: I0129 14:23:05.050963 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:05 crc kubenswrapper[4753]: I0129 14:23:05.546237 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:05 crc kubenswrapper[4753]: W0129 14:23:05.560376 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc50a465_1831_48ea_aeda_d9861e5e2007.slice/crio-f2321ee234d043c3534b6b79d5a8f7027d2c27e114ef538ffdc4bec2441bfc7d WatchSource:0}: Error finding container f2321ee234d043c3534b6b79d5a8f7027d2c27e114ef538ffdc4bec2441bfc7d: Status 404 returned error can't find the container with id f2321ee234d043c3534b6b79d5a8f7027d2c27e114ef538ffdc4bec2441bfc7d Jan 29 14:23:05 crc kubenswrapper[4753]: I0129 14:23:05.606437 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc50a465-1831-48ea-aeda-d9861e5e2007","Type":"ContainerStarted","Data":"f2321ee234d043c3534b6b79d5a8f7027d2c27e114ef538ffdc4bec2441bfc7d"} Jan 29 14:23:06 crc kubenswrapper[4753]: I0129 14:23:06.176417 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c948b692-cedf-4e4c-9876-69cf7f95d8b2" path="/var/lib/kubelet/pods/c948b692-cedf-4e4c-9876-69cf7f95d8b2/volumes" Jan 29 14:23:06 crc kubenswrapper[4753]: I0129 14:23:06.621363 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc50a465-1831-48ea-aeda-d9861e5e2007","Type":"ContainerStarted","Data":"32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3"} Jan 29 14:23:06 crc kubenswrapper[4753]: I0129 14:23:06.621679 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc50a465-1831-48ea-aeda-d9861e5e2007","Type":"ContainerStarted","Data":"2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6"} Jan 29 14:23:06 crc kubenswrapper[4753]: I0129 14:23:06.645242 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.645220955 podStartE2EDuration="2.645220955s" podCreationTimestamp="2026-01-29 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:06.638776061 +0000 UTC m=+1221.333510443" 
watchObservedRunningTime="2026-01-29 14:23:06.645220955 +0000 UTC m=+1221.339955337" Jan 29 14:23:07 crc kubenswrapper[4753]: I0129 14:23:07.248086 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:23:07 crc kubenswrapper[4753]: I0129 14:23:07.248366 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9045fb02-a12f-47c9-afe8-80f12d1599c2" containerName="kube-state-metrics" containerID="cri-o://961176ee521f54c0ddc517579ff16849595beaa66b484e3eb7f1bd0bcea7b8e3" gracePeriod=30 Jan 29 14:23:07 crc kubenswrapper[4753]: E0129 14:23:07.461360 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9045fb02_a12f_47c9_afe8_80f12d1599c2.slice/crio-conmon-961176ee521f54c0ddc517579ff16849595beaa66b484e3eb7f1bd0bcea7b8e3.scope\": RecentStats: unable to find data in memory cache]" Jan 29 14:23:07 crc kubenswrapper[4753]: I0129 14:23:07.635638 4753 generic.go:334] "Generic (PLEG): container finished" podID="9045fb02-a12f-47c9-afe8-80f12d1599c2" containerID="961176ee521f54c0ddc517579ff16849595beaa66b484e3eb7f1bd0bcea7b8e3" exitCode=2 Jan 29 14:23:07 crc kubenswrapper[4753]: I0129 14:23:07.635742 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9045fb02-a12f-47c9-afe8-80f12d1599c2","Type":"ContainerDied","Data":"961176ee521f54c0ddc517579ff16849595beaa66b484e3eb7f1bd0bcea7b8e3"} Jan 29 14:23:07 crc kubenswrapper[4753]: I0129 14:23:07.717227 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 14:23:07 crc kubenswrapper[4753]: I0129 14:23:07.908400 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzpqc\" (UniqueName: \"kubernetes.io/projected/9045fb02-a12f-47c9-afe8-80f12d1599c2-kube-api-access-zzpqc\") pod \"9045fb02-a12f-47c9-afe8-80f12d1599c2\" (UID: \"9045fb02-a12f-47c9-afe8-80f12d1599c2\") " Jan 29 14:23:07 crc kubenswrapper[4753]: I0129 14:23:07.917726 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9045fb02-a12f-47c9-afe8-80f12d1599c2-kube-api-access-zzpqc" (OuterVolumeSpecName: "kube-api-access-zzpqc") pod "9045fb02-a12f-47c9-afe8-80f12d1599c2" (UID: "9045fb02-a12f-47c9-afe8-80f12d1599c2"). InnerVolumeSpecName "kube-api-access-zzpqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.011289 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzpqc\" (UniqueName: \"kubernetes.io/projected/9045fb02-a12f-47c9-afe8-80f12d1599c2-kube-api-access-zzpqc\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.256253 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.645346 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9045fb02-a12f-47c9-afe8-80f12d1599c2","Type":"ContainerDied","Data":"2b69f0188b65b1f471344137471b113f829a52450704527065301b617bac75d7"} Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.645428 4753 scope.go:117] "RemoveContainer" containerID="961176ee521f54c0ddc517579ff16849595beaa66b484e3eb7f1bd0bcea7b8e3" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.645435 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.688033 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.706743 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.727738 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:23:08 crc kubenswrapper[4753]: E0129 14:23:08.728509 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9045fb02-a12f-47c9-afe8-80f12d1599c2" containerName="kube-state-metrics" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.728541 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9045fb02-a12f-47c9-afe8-80f12d1599c2" containerName="kube-state-metrics" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.728886 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9045fb02-a12f-47c9-afe8-80f12d1599c2" containerName="kube-state-metrics" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.729876 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.733002 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.733183 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.742628 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.833349 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.833957 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.834136 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.834416 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8w7g\" (UniqueName: \"kubernetes.io/projected/05315649-b501-4aae-9c14-4e632b89be53-kube-api-access-h8w7g\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.937130 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.937594 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.937655 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.937795 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8w7g\" 
(UniqueName: \"kubernetes.io/projected/05315649-b501-4aae-9c14-4e632b89be53-kube-api-access-h8w7g\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.941864 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.942262 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.943006 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:08 crc kubenswrapper[4753]: I0129 14:23:08.969408 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8w7g\" (UniqueName: \"kubernetes.io/projected/05315649-b501-4aae-9c14-4e632b89be53-kube-api-access-h8w7g\") pod \"kube-state-metrics-0\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " pod="openstack/kube-state-metrics-0" Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.056743 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.121858 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.122139 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="ceilometer-central-agent" containerID="cri-o://646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9" gracePeriod=30 Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.122489 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="ceilometer-notification-agent" containerID="cri-o://a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53" gracePeriod=30 Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.122503 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="proxy-httpd" containerID="cri-o://de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939" gracePeriod=30 Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.122578 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="sg-core" containerID="cri-o://ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7" gracePeriod=30 Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.129807 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.130496 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.604632 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:23:09 crc kubenswrapper[4753]: W0129 14:23:09.605898 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05315649_b501_4aae_9c14_4e632b89be53.slice/crio-36e4c9d67763ebbdfa3c794e1f00f6b4e59a7abd195ad09b4271debed373fbce WatchSource:0}: Error finding container 36e4c9d67763ebbdfa3c794e1f00f6b4e59a7abd195ad09b4271debed373fbce: Status 404 returned error can't find the container with id 36e4c9d67763ebbdfa3c794e1f00f6b4e59a7abd195ad09b4271debed373fbce Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.658922 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"05315649-b501-4aae-9c14-4e632b89be53","Type":"ContainerStarted","Data":"36e4c9d67763ebbdfa3c794e1f00f6b4e59a7abd195ad09b4271debed373fbce"} Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.662342 4753 generic.go:334] "Generic (PLEG): container finished" podID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerID="de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939" exitCode=0 Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.662402 4753 generic.go:334] "Generic (PLEG): container finished" podID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerID="ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7" exitCode=2 Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.662411 4753 generic.go:334] 
"Generic (PLEG): container finished" podID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerID="646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9" exitCode=0 Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.662406 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerDied","Data":"de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939"} Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.662480 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerDied","Data":"ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7"} Jan 29 14:23:09 crc kubenswrapper[4753]: I0129 14:23:09.662495 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerDied","Data":"646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9"} Jan 29 14:23:10 crc kubenswrapper[4753]: I0129 14:23:10.140477 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 14:23:10 crc kubenswrapper[4753]: I0129 14:23:10.141096 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 14:23:10 crc kubenswrapper[4753]: I0129 14:23:10.160192 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9045fb02-a12f-47c9-afe8-80f12d1599c2" path="/var/lib/kubelet/pods/9045fb02-a12f-47c9-afe8-80f12d1599c2/volumes" Jan 29 14:23:10 crc kubenswrapper[4753]: I0129 14:23:10.674512 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"05315649-b501-4aae-9c14-4e632b89be53","Type":"ContainerStarted","Data":"27f10dc0be419156c4795db55c1bed92ee49c68890c1310f78d7e5642ab655c9"} Jan 29 14:23:10 crc kubenswrapper[4753]: I0129 14:23:10.676707 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 29 14:23:10 crc kubenswrapper[4753]: I0129 14:23:10.700391 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.292753871 podStartE2EDuration="2.700364193s" podCreationTimestamp="2026-01-29 14:23:08 +0000 UTC" firstStartedPulling="2026-01-29 14:23:09.609608918 +0000 UTC m=+1224.304343310" lastFinishedPulling="2026-01-29 14:23:10.01721925 +0000 UTC m=+1224.711953632" observedRunningTime="2026-01-29 14:23:10.697879406 +0000 UTC m=+1225.392613858" watchObservedRunningTime="2026-01-29 14:23:10.700364193 +0000 UTC m=+1225.395098605" Jan 29 14:23:10 crc kubenswrapper[4753]: I0129 14:23:10.922169 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.292439 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.403706 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-config-data\") pod \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.403825 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-combined-ca-bundle\") pod \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.403881 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjt7q\" (UniqueName: \"kubernetes.io/projected/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-kube-api-access-hjt7q\") pod \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.403924 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-scripts\") pod \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.403952 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-log-httpd\") pod \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.404041 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-run-httpd\") pod \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.404414 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f1d44c22-d153-48fd-ac3a-cd43ab7b5339" (UID: "f1d44c22-d153-48fd-ac3a-cd43ab7b5339"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.404596 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f1d44c22-d153-48fd-ac3a-cd43ab7b5339" (UID: "f1d44c22-d153-48fd-ac3a-cd43ab7b5339"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.404719 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-sg-core-conf-yaml\") pod \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\" (UID: \"f1d44c22-d153-48fd-ac3a-cd43ab7b5339\") " Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.405445 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.405464 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.416379 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-scripts" (OuterVolumeSpecName: "scripts") pod "f1d44c22-d153-48fd-ac3a-cd43ab7b5339" (UID: "f1d44c22-d153-48fd-ac3a-cd43ab7b5339"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.419834 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-kube-api-access-hjt7q" (OuterVolumeSpecName: "kube-api-access-hjt7q") pod "f1d44c22-d153-48fd-ac3a-cd43ab7b5339" (UID: "f1d44c22-d153-48fd-ac3a-cd43ab7b5339"). InnerVolumeSpecName "kube-api-access-hjt7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.446279 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f1d44c22-d153-48fd-ac3a-cd43ab7b5339" (UID: "f1d44c22-d153-48fd-ac3a-cd43ab7b5339"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.476244 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1d44c22-d153-48fd-ac3a-cd43ab7b5339" (UID: "f1d44c22-d153-48fd-ac3a-cd43ab7b5339"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.507324 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.507351 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.507362 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjt7q\" (UniqueName: \"kubernetes.io/projected/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-kube-api-access-hjt7q\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.507371 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.526884 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-config-data" (OuterVolumeSpecName: "config-data") pod "f1d44c22-d153-48fd-ac3a-cd43ab7b5339" (UID: "f1d44c22-d153-48fd-ac3a-cd43ab7b5339"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.612050 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1d44c22-d153-48fd-ac3a-cd43ab7b5339-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.686058 4753 generic.go:334] "Generic (PLEG): container finished" podID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerID="a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53" exitCode=0 Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.686126 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerDied","Data":"a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53"} Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.686165 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.686186 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1d44c22-d153-48fd-ac3a-cd43ab7b5339","Type":"ContainerDied","Data":"7498d543e284a6b1f4a47109845cfde5024ef772c95e0dde9afae1a5b55420d2"} Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.686208 4753 scope.go:117] "RemoveContainer" containerID="de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.726568 4753 scope.go:117] "RemoveContainer" containerID="ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.760212 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.768376 4753 scope.go:117] "RemoveContainer" containerID="a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.776191 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.781794 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:11 crc kubenswrapper[4753]: E0129 14:23:11.782223 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="ceilometer-central-agent" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.782242 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="ceilometer-central-agent" Jan 29 14:23:11 crc kubenswrapper[4753]: E0129 14:23:11.782257 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="ceilometer-notification-agent" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.782264 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="ceilometer-notification-agent" Jan 29 14:23:11 crc kubenswrapper[4753]: E0129 14:23:11.782278 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="sg-core" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.782284 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="sg-core" Jan 29 14:23:11 crc kubenswrapper[4753]: E0129 14:23:11.782301 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="proxy-httpd" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.782308 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="proxy-httpd" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.782515 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="ceilometer-notification-agent" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.782525 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="sg-core" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.782536 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="proxy-httpd" Jan 
29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.782546 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" containerName="ceilometer-central-agent" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.784183 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.793942 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.810988 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.811338 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.812244 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.824515 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.824571 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-log-httpd\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.824634 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-run-httpd\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.824652 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.824678 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldkqm\" (UniqueName: \"kubernetes.io/projected/78077c16-0597-4700-98f2-116c33dac268-kube-api-access-ldkqm\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.824820 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-scripts\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.824913 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-sg-core-conf-yaml\") 
pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.824946 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-config-data\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.829333 4753 scope.go:117] "RemoveContainer" containerID="646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.851310 4753 scope.go:117] "RemoveContainer" containerID="de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939" Jan 29 14:23:11 crc kubenswrapper[4753]: E0129 14:23:11.851700 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939\": container with ID starting with de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939 not found: ID does not exist" containerID="de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.851730 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939"} err="failed to get container status \"de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939\": rpc error: code = NotFound desc = could not find container \"de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939\": container with ID starting with de0ab4bad2b0025d07cad73070bbfa4bfca67c4008f78a897cb336bf97c37939 not found: ID does not exist" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.851751 4753 scope.go:117] "RemoveContainer" containerID="ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7" Jan 29 14:23:11 crc kubenswrapper[4753]: E0129 14:23:11.852012 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7\": container with ID starting with ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7 not found: ID does not exist" containerID="ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.852064 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7"} err="failed to get container status \"ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7\": rpc error: code = NotFound desc = could not find container \"ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7\": container with ID starting with ad8b8ccde0c11abf8f69078baaf6c7b5519d12d4e1757ab87dd6d75758fa80c7 not found: ID does not exist" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.852096 4753 scope.go:117] "RemoveContainer" containerID="a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53" Jan 29 14:23:11 crc kubenswrapper[4753]: E0129 14:23:11.852427 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53\": container with ID starting with a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53 not found: ID does not exist" containerID="a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.852451 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53"} err="failed to get container status \"a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53\": rpc error: code = NotFound desc = could not find container \"a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53\": container with ID starting with a8b18156b92ba5c78a5d1e2ea43444337a3a9cb19e26a6ba893d9aa7cbeeba53 not found: ID does not exist" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.852465 4753 scope.go:117] "RemoveContainer" containerID="646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9" Jan 29 14:23:11 crc kubenswrapper[4753]: E0129 14:23:11.852788 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9\": container with ID starting with 646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9 not found: ID does not exist" containerID="646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.852836 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9"} err="failed to get container status \"646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9\": rpc error: code = NotFound desc = could not find container \"646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9\": container with ID starting with 646a23e37f8b67d7972e5ee38a9d475430910c67dcf57a117b3021314fbae7f9 not found: ID does not exist" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.927338 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldkqm\" (UniqueName: \"kubernetes.io/projected/78077c16-0597-4700-98f2-116c33dac268-kube-api-access-ldkqm\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.927402 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-scripts\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.927446 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.927467 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-config-data\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc 
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.927543 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.927575 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-log-httpd\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.927621 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-run-httpd\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.927640 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.928553 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-log-httpd\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.928631 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-run-httpd\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.932018 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-config-data\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.932424 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-scripts\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.932589 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.934532 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0"
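Each "MountVolume started" / "MountVolume.SetUp succeeded" pair above comes from the volume manager's reconciler, which diffs the volumes the pod spec demands against what is actually mounted and queues a mount for each gap. A schematic sketch of that loop, with illustrative names rather than the kubelet's real API:

package main

import "fmt"

// reconcile mounts every desired volume that is not yet in the actual
// state, mirroring the started/succeeded pairs in the log above.
func reconcile(desired []string, mounted map[string]bool, mount func(string) error) {
	for _, vol := range desired {
		if mounted[vol] {
			continue // actual state already matches desired state
		}
		fmt.Printf("MountVolume started for volume %q\n", vol)
		if err := mount(vol); err == nil {
			fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
			mounted[vol] = true
		}
	}
}

func main() {
	desired := []string{"scripts", "config-data", "combined-ca-bundle"}
	reconcile(desired, map[string]bool{}, func(string) error { return nil })
}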
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:11 crc kubenswrapper[4753]: I0129 14:23:11.952791 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldkqm\" (UniqueName: \"kubernetes.io/projected/78077c16-0597-4700-98f2-116c33dac268-kube-api-access-ldkqm\") pod \"ceilometer-0\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " pod="openstack/ceilometer-0" Jan 29 14:23:12 crc kubenswrapper[4753]: I0129 14:23:12.134116 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:12 crc kubenswrapper[4753]: I0129 14:23:12.188575 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d44c22-d153-48fd-ac3a-cd43ab7b5339" path="/var/lib/kubelet/pods/f1d44c22-d153-48fd-ac3a-cd43ab7b5339/volumes" Jan 29 14:23:12 crc kubenswrapper[4753]: I0129 14:23:12.629647 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:12 crc kubenswrapper[4753]: W0129 14:23:12.632673 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78077c16_0597_4700_98f2_116c33dac268.slice/crio-e8275fc95846345399737879e8ff13a6c67679998d9e017b7a56c89ec71be40d WatchSource:0}: Error finding container e8275fc95846345399737879e8ff13a6c67679998d9e017b7a56c89ec71be40d: Status 404 returned error can't find the container with id e8275fc95846345399737879e8ff13a6c67679998d9e017b7a56c89ec71be40d Jan 29 14:23:12 crc kubenswrapper[4753]: I0129 14:23:12.696413 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerStarted","Data":"e8275fc95846345399737879e8ff13a6c67679998d9e017b7a56c89ec71be40d"} Jan 29 14:23:13 crc kubenswrapper[4753]: I0129 14:23:13.254634 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 29 14:23:13 crc kubenswrapper[4753]: I0129 14:23:13.309378 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 14:23:13 crc kubenswrapper[4753]: I0129 14:23:13.713983 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerStarted","Data":"971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a"} Jan 29 14:23:13 crc kubenswrapper[4753]: I0129 14:23:13.745578 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 14:23:14 crc kubenswrapper[4753]: I0129 14:23:14.723830 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerStarted","Data":"9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd"} Jan 29 14:23:15 crc kubenswrapper[4753]: I0129 14:23:15.051190 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 14:23:15 crc kubenswrapper[4753]: I0129 14:23:15.051250 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 14:23:16 crc kubenswrapper[4753]: I0129 
Jan 29 14:23:16 crc kubenswrapper[4753]: I0129 14:23:16.133325 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 29 14:23:16 crc kubenswrapper[4753]: I0129 14:23:16.133393 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 29 14:23:16 crc kubenswrapper[4753]: I0129 14:23:16.748403 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerStarted","Data":"fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055"}
Jan 29 14:23:18 crc kubenswrapper[4753]: I0129 14:23:18.799009 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerStarted","Data":"ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0"}
Jan 29 14:23:18 crc kubenswrapper[4753]: I0129 14:23:18.799747 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 29 14:23:18 crc kubenswrapper[4753]: I0129 14:23:18.826486 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.74568773 podStartE2EDuration="7.826461585s" podCreationTimestamp="2026-01-29 14:23:11 +0000 UTC" firstStartedPulling="2026-01-29 14:23:12.635351982 +0000 UTC m=+1227.330086374" lastFinishedPulling="2026-01-29 14:23:17.716125837 +0000 UTC m=+1232.410860229" observedRunningTime="2026-01-29 14:23:18.823868335 +0000 UTC m=+1233.518602727" watchObservedRunningTime="2026-01-29 14:23:18.826461585 +0000 UTC m=+1233.521196007"
Jan 29 14:23:19 crc kubenswrapper[4753]: I0129 14:23:19.080425 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 29 14:23:19 crc kubenswrapper[4753]: I0129 14:23:19.144137 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 29 14:23:19 crc kubenswrapper[4753]: I0129 14:23:19.147237 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 29 14:23:19 crc kubenswrapper[4753]: I0129 14:23:19.164082 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 29 14:23:19 crc kubenswrapper[4753]: I0129 14:23:19.818797 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 29 14:23:21 crc kubenswrapper[4753]: I0129 14:23:21.833839 4753 generic.go:334] "Generic (PLEG): container finished" podID="c49944e0-0732-4d96-9521-f0aac0d45c4a" containerID="c2dd409f1ec25620fee980ea2ff1580887be01abfbc8ed8b108b40048a1db0f2" exitCode=137
Jan 29 14:23:21 crc kubenswrapper[4753]: I0129 14:23:21.833970 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c49944e0-0732-4d96-9521-f0aac0d45c4a","Type":"ContainerDied","Data":"c2dd409f1ec25620fee980ea2ff1580887be01abfbc8ed8b108b40048a1db0f2"}
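The nova-api-0 probe output above, "context deadline exceeded (Client.Timeout exceeded while awaiting headers)", is the stock net/http error produced when the probe's timeoutSeconds elapses before the server returns response headers. A sketch that reproduces the same text against a slow endpoint (the address is reused from the log purely for illustration):

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// A 1s client timeout stands in for the probe's timeoutSeconds.
	client := &http.Client{Timeout: 1 * time.Second}
	_, err := client.Get("http://10.217.0.195:8774/")
	fmt.Println(err)
	// e.g. Get "http://10.217.0.195:8774/": context deadline exceeded
	// (Client.Timeout exceeded while awaiting headers)
}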
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.339346 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.360478 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-config-data\") pod \"c49944e0-0732-4d96-9521-f0aac0d45c4a\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") "
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.360542 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-combined-ca-bundle\") pod \"c49944e0-0732-4d96-9521-f0aac0d45c4a\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") "
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.360781 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb6fj\" (UniqueName: \"kubernetes.io/projected/c49944e0-0732-4d96-9521-f0aac0d45c4a-kube-api-access-tb6fj\") pod \"c49944e0-0732-4d96-9521-f0aac0d45c4a\" (UID: \"c49944e0-0732-4d96-9521-f0aac0d45c4a\") "
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.368476 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49944e0-0732-4d96-9521-f0aac0d45c4a-kube-api-access-tb6fj" (OuterVolumeSpecName: "kube-api-access-tb6fj") pod "c49944e0-0732-4d96-9521-f0aac0d45c4a" (UID: "c49944e0-0732-4d96-9521-f0aac0d45c4a"). InnerVolumeSpecName "kube-api-access-tb6fj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.395720 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c49944e0-0732-4d96-9521-f0aac0d45c4a" (UID: "c49944e0-0732-4d96-9521-f0aac0d45c4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.462310 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.462634 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49944e0-0732-4d96-9521-f0aac0d45c4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.462647 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb6fj\" (UniqueName: \"kubernetes.io/projected/c49944e0-0732-4d96-9521-f0aac0d45c4a-kube-api-access-tb6fj\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.851390 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c49944e0-0732-4d96-9521-f0aac0d45c4a","Type":"ContainerDied","Data":"932baa668ebfd1e9b92264b04d1eeb70703449a10fc8b4f94208fb1b7e25698e"} Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.851482 4753 scope.go:117] "RemoveContainer" containerID="c2dd409f1ec25620fee980ea2ff1580887be01abfbc8ed8b108b40048a1db0f2" Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.851787 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.913022 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.929332 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.946953 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 14:23:22 crc kubenswrapper[4753]: E0129 14:23:22.947405 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49944e0-0732-4d96-9521-f0aac0d45c4a" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.947426 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49944e0-0732-4d96-9521-f0aac0d45c4a" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.947618 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49944e0-0732-4d96-9521-f0aac0d45c4a" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.948260 4753 util.go:30] "No sandbox for pod can be found. 
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.948260 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.949838 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.951429 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.951637 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.951756 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.975631 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.975721 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.975771 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.975799 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:23:22 crc kubenswrapper[4753]: I0129 14:23:22.975833 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4pt2\" (UniqueName: \"kubernetes.io/projected/de34f6dc-67dd-4054-84a9-a051e0ba2876-kube-api-access-f4pt2\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.077476 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0"
pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.077716 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.077758 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.077813 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4pt2\" (UniqueName: \"kubernetes.io/projected/de34f6dc-67dd-4054-84a9-a051e0ba2876-kube-api-access-f4pt2\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.083069 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.083581 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.085275 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.086601 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.096004 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4pt2\" (UniqueName: \"kubernetes.io/projected/de34f6dc-67dd-4054-84a9-a051e0ba2876-kube-api-access-f4pt2\") pod \"nova-cell1-novncproxy-0\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.262846 4753 util.go:30] "No sandbox for pod can be found. 
Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.262846 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.766799 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 29 14:23:23 crc kubenswrapper[4753]: W0129 14:23:23.775044 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde34f6dc_67dd_4054_84a9_a051e0ba2876.slice/crio-648e99ffb8599c2ec868390143f0a3162a551d58bc4c5e7e71ce50e71e20b390 WatchSource:0}: Error finding container 648e99ffb8599c2ec868390143f0a3162a551d58bc4c5e7e71ce50e71e20b390: Status 404 returned error can't find the container with id 648e99ffb8599c2ec868390143f0a3162a551d58bc4c5e7e71ce50e71e20b390
Jan 29 14:23:23 crc kubenswrapper[4753]: I0129 14:23:23.866731 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"de34f6dc-67dd-4054-84a9-a051e0ba2876","Type":"ContainerStarted","Data":"648e99ffb8599c2ec868390143f0a3162a551d58bc4c5e7e71ce50e71e20b390"}
Jan 29 14:23:24 crc kubenswrapper[4753]: I0129 14:23:24.164384 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c49944e0-0732-4d96-9521-f0aac0d45c4a" path="/var/lib/kubelet/pods/c49944e0-0732-4d96-9521-f0aac0d45c4a/volumes"
Jan 29 14:23:24 crc kubenswrapper[4753]: I0129 14:23:24.881265 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"de34f6dc-67dd-4054-84a9-a051e0ba2876","Type":"ContainerStarted","Data":"857539f16ab7d5f43263fb3fc69c40c7cfc6306d139f5a0e01c992147a025e17"}
Jan 29 14:23:24 crc kubenswrapper[4753]: I0129 14:23:24.914531 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.914511164 podStartE2EDuration="2.914511164s" podCreationTimestamp="2026-01-29 14:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:24.904275351 +0000 UTC m=+1239.599009743" watchObservedRunningTime="2026-01-29 14:23:24.914511164 +0000 UTC m=+1239.609245546"
Jan 29 14:23:25 crc kubenswrapper[4753]: I0129 14:23:25.055788 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 29 14:23:25 crc kubenswrapper[4753]: I0129 14:23:25.056449 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 29 14:23:25 crc kubenswrapper[4753]: I0129 14:23:25.065489 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 29 14:23:25 crc kubenswrapper[4753]: I0129 14:23:25.068467 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 29 14:23:25 crc kubenswrapper[4753]: I0129 14:23:25.891582 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 29 14:23:25 crc kubenswrapper[4753]: I0129 14:23:25.896199 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.112443 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-zrdnm"]
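The "Observed pod startup duration" entries decompose as follows: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from it; for nova-cell1-novncproxy-0 both pull timestamps are the zero time, so the two durations coincide. A sketch checking the arithmetic against the earlier ceilometer-0 entry:

package main

import (
	"fmt"
	"time"
)

func main() {
	// podStartE2EDuration="7.826461585s" from the ceilometer-0 entry.
	e2e := 7826461585 * time.Nanosecond
	firstPull, _ := time.Parse(time.RFC3339Nano, "2026-01-29T14:23:12.635351982Z")
	lastPull, _ := time.Parse(time.RFC3339Nano, "2026-01-29T14:23:17.716125837Z")
	// SLO duration excludes the time spent pulling images.
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println(slo) // 2.74568773s, matching podStartSLOduration=2.74568773
}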
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.116608 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.138299 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-zrdnm"]
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.138880 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-config\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.138981 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-swift-storage-0\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.139075 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-sb\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.139112 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-svc\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.139140 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-nb\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.139192 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm2dc\" (UniqueName: \"kubernetes.io/projected/386b408e-6ac5-4f1d-8403-be4fc8aec57d-kube-api-access-jm2dc\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.259046 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-nb\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.260712 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm2dc\" (UniqueName: \"kubernetes.io/projected/386b408e-6ac5-4f1d-8403-be4fc8aec57d-kube-api-access-jm2dc\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.260816 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-config\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.261013 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-swift-storage-0\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.261068 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-sb\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.261088 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-svc\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.259994 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-nb\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.262693 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-config\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.262728 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-swift-storage-0\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.262880 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-svc\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.263361 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-sb\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm"
\"kubernetes.io/projected/386b408e-6ac5-4f1d-8403-be4fc8aec57d-kube-api-access-jm2dc\") pod \"dnsmasq-dns-867cd545c7-zrdnm\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.446358 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.875087 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-zrdnm"] Jan 29 14:23:26 crc kubenswrapper[4753]: I0129 14:23:26.904622 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" event={"ID":"386b408e-6ac5-4f1d-8403-be4fc8aec57d","Type":"ContainerStarted","Data":"1f6dc0380a03f660a067b38b55a506e9e2023e8197e9373759736a6c99588bed"} Jan 29 14:23:27 crc kubenswrapper[4753]: I0129 14:23:27.923046 4753 generic.go:334] "Generic (PLEG): container finished" podID="386b408e-6ac5-4f1d-8403-be4fc8aec57d" containerID="f30589dcc7d57df07d54caec7f4038d30558bc380d3ec02279fbac2497ee8ee2" exitCode=0 Jan 29 14:23:27 crc kubenswrapper[4753]: I0129 14:23:27.925089 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" event={"ID":"386b408e-6ac5-4f1d-8403-be4fc8aec57d","Type":"ContainerDied","Data":"f30589dcc7d57df07d54caec7f4038d30558bc380d3ec02279fbac2497ee8ee2"} Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.236894 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.238013 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="proxy-httpd" containerID="cri-o://ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0" gracePeriod=30 Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.238159 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="sg-core" containerID="cri-o://fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055" gracePeriod=30 Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.238206 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="ceilometer-notification-agent" containerID="cri-o://9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd" gracePeriod=30 Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.237984 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="ceilometer-central-agent" containerID="cri-o://971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a" gracePeriod=30 Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.259060 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.197:3000/\": EOF" Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.265248 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.940004 4753 generic.go:334] 
"Generic (PLEG): container finished" podID="78077c16-0597-4700-98f2-116c33dac268" containerID="ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0" exitCode=0 Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.940046 4753 generic.go:334] "Generic (PLEG): container finished" podID="78077c16-0597-4700-98f2-116c33dac268" containerID="fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055" exitCode=2 Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.940057 4753 generic.go:334] "Generic (PLEG): container finished" podID="78077c16-0597-4700-98f2-116c33dac268" containerID="971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a" exitCode=0 Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.940111 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerDied","Data":"ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0"} Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.940400 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerDied","Data":"fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055"} Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.940431 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerDied","Data":"971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a"} Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.943444 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" event={"ID":"386b408e-6ac5-4f1d-8403-be4fc8aec57d","Type":"ContainerStarted","Data":"f7c124c9fdda70a8e8042c2355a73ef99b59bc98a46f42c31fb3d71e038de5ed"} Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.943661 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" Jan 29 14:23:28 crc kubenswrapper[4753]: I0129 14:23:28.986236 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" podStartSLOduration=2.986203842 podStartE2EDuration="2.986203842s" podCreationTimestamp="2026-01-29 14:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:28.974760157 +0000 UTC m=+1243.669494549" watchObservedRunningTime="2026-01-29 14:23:28.986203842 +0000 UTC m=+1243.680938264" Jan 29 14:23:29 crc kubenswrapper[4753]: I0129 14:23:29.179124 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:29 crc kubenswrapper[4753]: I0129 14:23:29.179786 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-log" containerID="cri-o://2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6" gracePeriod=30 Jan 29 14:23:29 crc kubenswrapper[4753]: I0129 14:23:29.180128 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-api" containerID="cri-o://32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3" gracePeriod=30 Jan 29 14:23:29 crc kubenswrapper[4753]: I0129 14:23:29.968011 4753 generic.go:334] 
"Generic (PLEG): container finished" podID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerID="2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6" exitCode=143 Jan 29 14:23:29 crc kubenswrapper[4753]: I0129 14:23:29.968064 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc50a465-1831-48ea-aeda-d9861e5e2007","Type":"ContainerDied","Data":"2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6"} Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.470868 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.551354 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-ceilometer-tls-certs\") pod \"78077c16-0597-4700-98f2-116c33dac268\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.551902 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-combined-ca-bundle\") pod \"78077c16-0597-4700-98f2-116c33dac268\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.551988 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-sg-core-conf-yaml\") pod \"78077c16-0597-4700-98f2-116c33dac268\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.552080 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-log-httpd\") pod \"78077c16-0597-4700-98f2-116c33dac268\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.552195 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-run-httpd\") pod \"78077c16-0597-4700-98f2-116c33dac268\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.552244 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-config-data\") pod \"78077c16-0597-4700-98f2-116c33dac268\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.552291 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldkqm\" (UniqueName: \"kubernetes.io/projected/78077c16-0597-4700-98f2-116c33dac268-kube-api-access-ldkqm\") pod \"78077c16-0597-4700-98f2-116c33dac268\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.552345 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-scripts\") pod \"78077c16-0597-4700-98f2-116c33dac268\" (UID: \"78077c16-0597-4700-98f2-116c33dac268\") " Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.552772 4753 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "78077c16-0597-4700-98f2-116c33dac268" (UID: "78077c16-0597-4700-98f2-116c33dac268"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.553884 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.554043 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "78077c16-0597-4700-98f2-116c33dac268" (UID: "78077c16-0597-4700-98f2-116c33dac268"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.577321 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78077c16-0597-4700-98f2-116c33dac268-kube-api-access-ldkqm" (OuterVolumeSpecName: "kube-api-access-ldkqm") pod "78077c16-0597-4700-98f2-116c33dac268" (UID: "78077c16-0597-4700-98f2-116c33dac268"). InnerVolumeSpecName "kube-api-access-ldkqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.578390 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-scripts" (OuterVolumeSpecName: "scripts") pod "78077c16-0597-4700-98f2-116c33dac268" (UID: "78077c16-0597-4700-98f2-116c33dac268"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.598798 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "78077c16-0597-4700-98f2-116c33dac268" (UID: "78077c16-0597-4700-98f2-116c33dac268"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.645540 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "78077c16-0597-4700-98f2-116c33dac268" (UID: "78077c16-0597-4700-98f2-116c33dac268"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.651320 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78077c16-0597-4700-98f2-116c33dac268" (UID: "78077c16-0597-4700-98f2-116c33dac268"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.657921 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldkqm\" (UniqueName: \"kubernetes.io/projected/78077c16-0597-4700-98f2-116c33dac268-kube-api-access-ldkqm\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.657956 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.657971 4753 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.657985 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.657996 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.658007 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78077c16-0597-4700-98f2-116c33dac268-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.705172 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-config-data" (OuterVolumeSpecName: "config-data") pod "78077c16-0597-4700-98f2-116c33dac268" (UID: "78077c16-0597-4700-98f2-116c33dac268"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.759796 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78077c16-0597-4700-98f2-116c33dac268-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.980721 4753 generic.go:334] "Generic (PLEG): container finished" podID="78077c16-0597-4700-98f2-116c33dac268" containerID="9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd" exitCode=0 Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.980789 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerDied","Data":"9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd"} Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.980817 4753 util.go:48] "No ready sandbox for pod can be found. 
Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.980817 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.980846 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78077c16-0597-4700-98f2-116c33dac268","Type":"ContainerDied","Data":"e8275fc95846345399737879e8ff13a6c67679998d9e017b7a56c89ec71be40d"}
Jan 29 14:23:30 crc kubenswrapper[4753]: I0129 14:23:30.980883 4753 scope.go:117] "RemoveContainer" containerID="ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0"
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.020137 4753 scope.go:117] "RemoveContainer" containerID="fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055"
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.029327 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.043676 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.053759 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:23:31 crc kubenswrapper[4753]: E0129 14:23:31.054104 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="ceilometer-notification-agent"
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.054123 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="ceilometer-notification-agent"
Jan 29 14:23:31 crc kubenswrapper[4753]: E0129 14:23:31.054140 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="ceilometer-central-agent"
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.054161 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="ceilometer-central-agent"
Jan 29 14:23:31 crc kubenswrapper[4753]: E0129 14:23:31.054172 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="proxy-httpd"
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.054178 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="proxy-httpd"
Jan 29 14:23:31 crc kubenswrapper[4753]: E0129 14:23:31.054204 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="sg-core"
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.054209 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="sg-core"
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.054363 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="sg-core"
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.054382 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="proxy-httpd"
Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.054393 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="78077c16-0597-4700-98f2-116c33dac268" containerName="ceilometer-central-agent"
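Before admitting the replacement ceilometer-0, the kubelet again scrubs per-container CPU- and memory-manager state recorded under the old pod UID; the E-level cpu_manager lines are part of this routine cleanup rather than failures. An illustrative sketch of the pattern, with the map layout assumed for the example rather than taken from kubelet code:

package main

import "fmt"

// removeStaleState drops resource-manager assignments whose pod UID is
// no longer live, echoing the RemoveStaleState entries above.
func removeStaleState(assignments map[string]map[string]string, liveUIDs map[string]bool) {
	for uid, containers := range assignments {
		if liveUIDs[uid] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", uid, name)
			delete(containers, name)
		}
		delete(assignments, uid)
	}
}

func main() {
	a := map[string]map[string]string{
		"78077c16-0597-4700-98f2-116c33dac268": {"sg-core": "cpuset:0-1", "proxy-httpd": "cpuset:2"},
	}
	removeStaleState(a, map[string]bool{}) // old UID no longer live
	fmt.Println(len(a))                    // 0
}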
containerName="ceilometer-notification-agent" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.056192 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.062952 4753 scope.go:117] "RemoveContainer" containerID="9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.064726 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.064924 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.064927 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.082651 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.142489 4753 scope.go:117] "RemoveContainer" containerID="971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.163323 4753 scope.go:117] "RemoveContainer" containerID="ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0" Jan 29 14:23:31 crc kubenswrapper[4753]: E0129 14:23:31.164643 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0\": container with ID starting with ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0 not found: ID does not exist" containerID="ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.164692 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0"} err="failed to get container status \"ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0\": rpc error: code = NotFound desc = could not find container \"ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0\": container with ID starting with ec24885ebe525f8cc89680661e05db373d7c9d123e4862206f0acde4481917a0 not found: ID does not exist" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.164718 4753 scope.go:117] "RemoveContainer" containerID="fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055" Jan 29 14:23:31 crc kubenswrapper[4753]: E0129 14:23:31.165048 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055\": container with ID starting with fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055 not found: ID does not exist" containerID="fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.165088 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055"} err="failed to get container status \"fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055\": rpc error: code = NotFound desc = could not find container 
\"fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055\": container with ID starting with fae30fc29dcd42c8f3f35836af0c04a522ccdd46dd0c728c4868a413c0312055 not found: ID does not exist" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.165111 4753 scope.go:117] "RemoveContainer" containerID="9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166071 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-scripts\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166113 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-log-httpd\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166207 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166260 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-run-httpd\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166294 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-config-data\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166333 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c6mc\" (UniqueName: \"kubernetes.io/projected/a7d0250a-2990-49be-9486-ce7e4302e9e0-kube-api-access-2c6mc\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166368 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166400 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: E0129 14:23:31.166711 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd\": container with ID starting with 9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd not found: ID does not exist" containerID="9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166740 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd"} err="failed to get container status \"9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd\": rpc error: code = NotFound desc = could not find container \"9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd\": container with ID starting with 9155a6e193f788497accf4a406990d607e47d0d20cecdd032b94e90990d035bd not found: ID does not exist" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.166769 4753 scope.go:117] "RemoveContainer" containerID="971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a" Jan 29 14:23:31 crc kubenswrapper[4753]: E0129 14:23:31.166999 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a\": container with ID starting with 971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a not found: ID does not exist" containerID="971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.167022 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a"} err="failed to get container status \"971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a\": rpc error: code = NotFound desc = could not find container \"971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a\": container with ID starting with 971a69abe0a9c4a3466c4fe0a5cb96eb05d286dd2b2a06b47631fbd6bf35923a not found: ID does not exist" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.267853 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-scripts\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.267908 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-log-httpd\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.267949 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.268014 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-run-httpd\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: 
I0129 14:23:31.268048 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-config-data\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.268087 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6mc\" (UniqueName: \"kubernetes.io/projected/a7d0250a-2990-49be-9486-ce7e4302e9e0-kube-api-access-2c6mc\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.268122 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.268217 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.268468 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-log-httpd\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.269366 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-run-httpd\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.274587 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.275656 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-config-data\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.275897 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-scripts\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.277073 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.280287 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.289983 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c6mc\" (UniqueName: \"kubernetes.io/projected/a7d0250a-2990-49be-9486-ce7e4302e9e0-kube-api-access-2c6mc\") pod \"ceilometer-0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.453286 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.922191 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:31 crc kubenswrapper[4753]: W0129 14:23:31.932570 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7d0250a_2990_49be_9486_ce7e4302e9e0.slice/crio-aa13b726a6cb8767ea49c4829a1fecbcee0a3661bc1ba2a24dbe533e686cfd15 WatchSource:0}: Error finding container aa13b726a6cb8767ea49c4829a1fecbcee0a3661bc1ba2a24dbe533e686cfd15: Status 404 returned error can't find the container with id aa13b726a6cb8767ea49c4829a1fecbcee0a3661bc1ba2a24dbe533e686cfd15 Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.962753 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:31 crc kubenswrapper[4753]: I0129 14:23:31.992750 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerStarted","Data":"aa13b726a6cb8767ea49c4829a1fecbcee0a3661bc1ba2a24dbe533e686cfd15"} Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.171441 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78077c16-0597-4700-98f2-116c33dac268" path="/var/lib/kubelet/pods/78077c16-0597-4700-98f2-116c33dac268/volumes" Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.839077 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.916411 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77c8r\" (UniqueName: \"kubernetes.io/projected/dc50a465-1831-48ea-aeda-d9861e5e2007-kube-api-access-77c8r\") pod \"dc50a465-1831-48ea-aeda-d9861e5e2007\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.916505 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc50a465-1831-48ea-aeda-d9861e5e2007-logs\") pod \"dc50a465-1831-48ea-aeda-d9861e5e2007\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.916583 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-config-data\") pod \"dc50a465-1831-48ea-aeda-d9861e5e2007\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.916611 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-combined-ca-bundle\") pod \"dc50a465-1831-48ea-aeda-d9861e5e2007\" (UID: \"dc50a465-1831-48ea-aeda-d9861e5e2007\") " Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.917564 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc50a465-1831-48ea-aeda-d9861e5e2007-logs" (OuterVolumeSpecName: "logs") pod "dc50a465-1831-48ea-aeda-d9861e5e2007" (UID: "dc50a465-1831-48ea-aeda-d9861e5e2007"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.926234 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc50a465-1831-48ea-aeda-d9861e5e2007-kube-api-access-77c8r" (OuterVolumeSpecName: "kube-api-access-77c8r") pod "dc50a465-1831-48ea-aeda-d9861e5e2007" (UID: "dc50a465-1831-48ea-aeda-d9861e5e2007"). InnerVolumeSpecName "kube-api-access-77c8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.954659 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-config-data" (OuterVolumeSpecName: "config-data") pod "dc50a465-1831-48ea-aeda-d9861e5e2007" (UID: "dc50a465-1831-48ea-aeda-d9861e5e2007"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:32 crc kubenswrapper[4753]: I0129 14:23:32.961125 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc50a465-1831-48ea-aeda-d9861e5e2007" (UID: "dc50a465-1831-48ea-aeda-d9861e5e2007"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.012081 4753 generic.go:334] "Generic (PLEG): container finished" podID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerID="32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3" exitCode=0 Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.012141 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc50a465-1831-48ea-aeda-d9861e5e2007","Type":"ContainerDied","Data":"32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3"} Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.012181 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc50a465-1831-48ea-aeda-d9861e5e2007","Type":"ContainerDied","Data":"f2321ee234d043c3534b6b79d5a8f7027d2c27e114ef538ffdc4bec2441bfc7d"} Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.012196 4753 scope.go:117] "RemoveContainer" containerID="32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.012308 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.019409 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77c8r\" (UniqueName: \"kubernetes.io/projected/dc50a465-1831-48ea-aeda-d9861e5e2007-kube-api-access-77c8r\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.019447 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc50a465-1831-48ea-aeda-d9861e5e2007-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.019457 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.019467 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc50a465-1831-48ea-aeda-d9861e5e2007-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.065097 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.075947 4753 scope.go:117] "RemoveContainer" containerID="2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.080326 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.112970 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:33 crc kubenswrapper[4753]: E0129 14:23:33.113453 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-log" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.113472 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-log" Jan 29 14:23:33 crc kubenswrapper[4753]: E0129 14:23:33.113496 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-api" Jan 29 14:23:33 crc 
kubenswrapper[4753]: I0129 14:23:33.113502 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-api" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.113690 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-log" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.113709 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" containerName="nova-api-api" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.114701 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.117764 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.118098 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.118220 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.135457 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.154380 4753 scope.go:117] "RemoveContainer" containerID="32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3" Jan 29 14:23:33 crc kubenswrapper[4753]: E0129 14:23:33.154745 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3\": container with ID starting with 32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3 not found: ID does not exist" containerID="32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.154774 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3"} err="failed to get container status \"32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3\": rpc error: code = NotFound desc = could not find container \"32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3\": container with ID starting with 32a07b118f00a66f9f892436a5b7adc47087b589aea720ea38ec7c5d9dd75df3 not found: ID does not exist" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.154795 4753 scope.go:117] "RemoveContainer" containerID="2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6" Jan 29 14:23:33 crc kubenswrapper[4753]: E0129 14:23:33.155003 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6\": container with ID starting with 2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6 not found: ID does not exist" containerID="2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.155021 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6"} err="failed to get container status 
\"2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6\": rpc error: code = NotFound desc = could not find container \"2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6\": container with ID starting with 2afdc480f42eac02684c55d96cb66bf42a7b9a4b816017cefd224cd6e5bcf2d6 not found: ID does not exist" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.222674 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-public-tls-certs\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.222815 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcjsl\" (UniqueName: \"kubernetes.io/projected/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-kube-api-access-lcjsl\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.222992 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-logs\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.223040 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.223065 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-config-data\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.223127 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.264640 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.294515 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.325301 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-public-tls-certs\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.325594 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcjsl\" (UniqueName: \"kubernetes.io/projected/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-kube-api-access-lcjsl\") pod \"nova-api-0\" 
(UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.325676 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-logs\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.325703 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.325718 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-config-data\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.325764 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.326981 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-logs\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.330996 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.332014 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-public-tls-certs\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.332969 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-config-data\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.340456 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 crc kubenswrapper[4753]: I0129 14:23:33.347202 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcjsl\" (UniqueName: \"kubernetes.io/projected/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-kube-api-access-lcjsl\") pod \"nova-api-0\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " pod="openstack/nova-api-0" Jan 29 14:23:33 
crc kubenswrapper[4753]: I0129 14:23:33.518904 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.024123 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerStarted","Data":"17bb93e333b2823661450eda7f76dad7ce26c1c71e2f9248b07e522654054068"} Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.024492 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerStarted","Data":"2db5d885e81eea5189f34241c33621a360d510b2296d497b487d4703bfd23c20"} Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.043865 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.124755 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.187621 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc50a465-1831-48ea-aeda-d9861e5e2007" path="/var/lib/kubelet/pods/dc50a465-1831-48ea-aeda-d9861e5e2007/volumes" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.261407 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wml54"] Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.262534 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.268267 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.268339 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.272265 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wml54"] Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.352313 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-scripts\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.352372 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.352396 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfgxn\" (UniqueName: \"kubernetes.io/projected/5d0d7f08-1d4d-451d-98c3-873ac514fd53-kube-api-access-lfgxn\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.352443 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-config-data\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.454627 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-scripts\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.454692 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.454715 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfgxn\" (UniqueName: \"kubernetes.io/projected/5d0d7f08-1d4d-451d-98c3-873ac514fd53-kube-api-access-lfgxn\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.454755 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-config-data\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.463087 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-config-data\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.463254 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.467443 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-scripts\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.473758 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfgxn\" (UniqueName: \"kubernetes.io/projected/5d0d7f08-1d4d-451d-98c3-873ac514fd53-kube-api-access-lfgxn\") pod \"nova-cell1-cell-mapping-wml54\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:34 crc kubenswrapper[4753]: I0129 14:23:34.598253 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:35 crc kubenswrapper[4753]: I0129 14:23:35.046359 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerStarted","Data":"7a91c94eac94ff5110e30ed78e7fd3319e7ed69912fde42cc444f8928153a890"} Jan 29 14:23:35 crc kubenswrapper[4753]: I0129 14:23:35.050354 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1eb78eb9-7d3f-42cf-a308-2fb173bcb284","Type":"ContainerStarted","Data":"4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b"} Jan 29 14:23:35 crc kubenswrapper[4753]: I0129 14:23:35.050382 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1eb78eb9-7d3f-42cf-a308-2fb173bcb284","Type":"ContainerStarted","Data":"804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86"} Jan 29 14:23:35 crc kubenswrapper[4753]: I0129 14:23:35.050392 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1eb78eb9-7d3f-42cf-a308-2fb173bcb284","Type":"ContainerStarted","Data":"b66360fcfaa829999d42bdac77784cd8615f12db83cf79e82c38f10484579b67"} Jan 29 14:23:35 crc kubenswrapper[4753]: I0129 14:23:35.070541 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wml54"] Jan 29 14:23:35 crc kubenswrapper[4753]: I0129 14:23:35.085465 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.08544548 podStartE2EDuration="2.08544548s" podCreationTimestamp="2026-01-29 14:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:35.073599044 +0000 UTC m=+1249.768333426" watchObservedRunningTime="2026-01-29 14:23:35.08544548 +0000 UTC m=+1249.780179882" Jan 29 14:23:35 crc kubenswrapper[4753]: W0129 14:23:35.090899 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d0d7f08_1d4d_451d_98c3_873ac514fd53.slice/crio-a90e38b874e69019e1ccab80f34e57ac51635c41e3df8c3e86e195503098495c WatchSource:0}: Error finding container a90e38b874e69019e1ccab80f34e57ac51635c41e3df8c3e86e195503098495c: Status 404 returned error can't find the container with id a90e38b874e69019e1ccab80f34e57ac51635c41e3df8c3e86e195503098495c Jan 29 14:23:36 crc kubenswrapper[4753]: I0129 14:23:36.063390 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wml54" event={"ID":"5d0d7f08-1d4d-451d-98c3-873ac514fd53","Type":"ContainerStarted","Data":"1382f4decacf5f428514245aaf0e2515208a47c41adb2f648c7202bccd9216f9"} Jan 29 14:23:36 crc kubenswrapper[4753]: I0129 14:23:36.063794 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wml54" event={"ID":"5d0d7f08-1d4d-451d-98c3-873ac514fd53","Type":"ContainerStarted","Data":"a90e38b874e69019e1ccab80f34e57ac51635c41e3df8c3e86e195503098495c"} Jan 29 14:23:36 crc kubenswrapper[4753]: I0129 14:23:36.100659 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wml54" podStartSLOduration=2.100606721 podStartE2EDuration="2.100606721s" podCreationTimestamp="2026-01-29 14:23:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-29 14:23:36.095591238 +0000 UTC m=+1250.790325690" watchObservedRunningTime="2026-01-29 14:23:36.100606721 +0000 UTC m=+1250.795341143" Jan 29 14:23:36 crc kubenswrapper[4753]: I0129 14:23:36.448405 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" Jan 29 14:23:36 crc kubenswrapper[4753]: I0129 14:23:36.535860 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"] Jan 29 14:23:36 crc kubenswrapper[4753]: I0129 14:23:36.536519 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" podUID="593c409e-6f89-4cb9-8ab0-7910612780db" containerName="dnsmasq-dns" containerID="cri-o://d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56" gracePeriod=10 Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.072010 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerStarted","Data":"280cccf7b8efdec37605eddf07e4950ab5aa0f8080daf672409d33e82356a17b"} Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.073451 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="ceilometer-central-agent" containerID="cri-o://2db5d885e81eea5189f34241c33621a360d510b2296d497b487d4703bfd23c20" gracePeriod=30 Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.073851 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.074225 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="proxy-httpd" containerID="cri-o://280cccf7b8efdec37605eddf07e4950ab5aa0f8080daf672409d33e82356a17b" gracePeriod=30 Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.074353 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="sg-core" containerID="cri-o://7a91c94eac94ff5110e30ed78e7fd3319e7ed69912fde42cc444f8928153a890" gracePeriod=30 Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.074506 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="ceilometer-notification-agent" containerID="cri-o://17bb93e333b2823661450eda7f76dad7ce26c1c71e2f9248b07e522654054068" gracePeriod=30 Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.074584 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.081469 4753 generic.go:334] "Generic (PLEG): container finished" podID="593c409e-6f89-4cb9-8ab0-7910612780db" containerID="d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56" exitCode=0 Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.082036 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" event={"ID":"593c409e-6f89-4cb9-8ab0-7910612780db","Type":"ContainerDied","Data":"d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56"} Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.082093 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" event={"ID":"593c409e-6f89-4cb9-8ab0-7910612780db","Type":"ContainerDied","Data":"2a50d1f0c99bdb7f8526600576379385f9397a9272e98e1e55db5b8298016372"} Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.082115 4753 scope.go:117] "RemoveContainer" containerID="d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.102461 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.555899165 podStartE2EDuration="6.102438127s" podCreationTimestamp="2026-01-29 14:23:31 +0000 UTC" firstStartedPulling="2026-01-29 14:23:31.936269553 +0000 UTC m=+1246.631003925" lastFinishedPulling="2026-01-29 14:23:36.482808475 +0000 UTC m=+1251.177542887" observedRunningTime="2026-01-29 14:23:37.094124146 +0000 UTC m=+1251.788858528" watchObservedRunningTime="2026-01-29 14:23:37.102438127 +0000 UTC m=+1251.797172509" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.115923 4753 scope.go:117] "RemoveContainer" containerID="d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.116971 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-nb\") pod \"593c409e-6f89-4cb9-8ab0-7910612780db\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.117013 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-config\") pod \"593c409e-6f89-4cb9-8ab0-7910612780db\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.117074 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-swift-storage-0\") pod \"593c409e-6f89-4cb9-8ab0-7910612780db\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.117141 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqbm4\" (UniqueName: \"kubernetes.io/projected/593c409e-6f89-4cb9-8ab0-7910612780db-kube-api-access-kqbm4\") pod \"593c409e-6f89-4cb9-8ab0-7910612780db\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.117329 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-svc\") pod \"593c409e-6f89-4cb9-8ab0-7910612780db\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.117410 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-sb\") pod \"593c409e-6f89-4cb9-8ab0-7910612780db\" (UID: \"593c409e-6f89-4cb9-8ab0-7910612780db\") " Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.129368 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593c409e-6f89-4cb9-8ab0-7910612780db-kube-api-access-kqbm4" (OuterVolumeSpecName: "kube-api-access-kqbm4") pod "593c409e-6f89-4cb9-8ab0-7910612780db" (UID: "593c409e-6f89-4cb9-8ab0-7910612780db"). InnerVolumeSpecName "kube-api-access-kqbm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.213804 4753 scope.go:117] "RemoveContainer" containerID="d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56" Jan 29 14:23:37 crc kubenswrapper[4753]: E0129 14:23:37.218845 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56\": container with ID starting with d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56 not found: ID does not exist" containerID="d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.218889 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56"} err="failed to get container status \"d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56\": rpc error: code = NotFound desc = could not find container \"d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56\": container with ID starting with d65e384a207b264067aee4eee1ccd3996eca4286d410164e67dfcd3ca7963d56 not found: ID does not exist" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.218912 4753 scope.go:117] "RemoveContainer" containerID="d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.219570 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqbm4\" (UniqueName: \"kubernetes.io/projected/593c409e-6f89-4cb9-8ab0-7910612780db-kube-api-access-kqbm4\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:37 crc kubenswrapper[4753]: E0129 14:23:37.220521 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0\": container with ID starting with d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0 not found: ID does not exist" containerID="d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.220546 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0"} err="failed to get container status \"d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0\": rpc error: code = NotFound desc = could not find container 
\"d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0\": container with ID starting with d2132a7e4be0f3edec421c40cf92ca3c6bf6f7f49b5e47e7f970c9731daea9b0 not found: ID does not exist" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.247196 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "593c409e-6f89-4cb9-8ab0-7910612780db" (UID: "593c409e-6f89-4cb9-8ab0-7910612780db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.247296 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-config" (OuterVolumeSpecName: "config") pod "593c409e-6f89-4cb9-8ab0-7910612780db" (UID: "593c409e-6f89-4cb9-8ab0-7910612780db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.255197 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "593c409e-6f89-4cb9-8ab0-7910612780db" (UID: "593c409e-6f89-4cb9-8ab0-7910612780db"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.257010 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "593c409e-6f89-4cb9-8ab0-7910612780db" (UID: "593c409e-6f89-4cb9-8ab0-7910612780db"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.264879 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "593c409e-6f89-4cb9-8ab0-7910612780db" (UID: "593c409e-6f89-4cb9-8ab0-7910612780db"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.321310 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.321348 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.321363 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.321371 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:37 crc kubenswrapper[4753]: I0129 14:23:37.321380 4753 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/593c409e-6f89-4cb9-8ab0-7910612780db-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:38 crc kubenswrapper[4753]: I0129 14:23:38.100720 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bfb54f9b5-lhrcx" Jan 29 14:23:38 crc kubenswrapper[4753]: I0129 14:23:38.119782 4753 generic.go:334] "Generic (PLEG): container finished" podID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerID="280cccf7b8efdec37605eddf07e4950ab5aa0f8080daf672409d33e82356a17b" exitCode=0 Jan 29 14:23:38 crc kubenswrapper[4753]: I0129 14:23:38.119834 4753 generic.go:334] "Generic (PLEG): container finished" podID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerID="7a91c94eac94ff5110e30ed78e7fd3319e7ed69912fde42cc444f8928153a890" exitCode=2 Jan 29 14:23:38 crc kubenswrapper[4753]: I0129 14:23:38.119856 4753 generic.go:334] "Generic (PLEG): container finished" podID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerID="17bb93e333b2823661450eda7f76dad7ce26c1c71e2f9248b07e522654054068" exitCode=0 Jan 29 14:23:38 crc kubenswrapper[4753]: I0129 14:23:38.119887 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerDied","Data":"280cccf7b8efdec37605eddf07e4950ab5aa0f8080daf672409d33e82356a17b"} Jan 29 14:23:38 crc kubenswrapper[4753]: I0129 14:23:38.119927 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerDied","Data":"7a91c94eac94ff5110e30ed78e7fd3319e7ed69912fde42cc444f8928153a890"} Jan 29 14:23:38 crc kubenswrapper[4753]: I0129 14:23:38.119946 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerDied","Data":"17bb93e333b2823661450eda7f76dad7ce26c1c71e2f9248b07e522654054068"} Jan 29 14:23:38 crc kubenswrapper[4753]: I0129 14:23:38.199547 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"] Jan 29 14:23:38 crc kubenswrapper[4753]: I0129 14:23:38.199852 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bfb54f9b5-lhrcx"] Jan 
29 14:23:38 crc kubenswrapper[4753]: E0129 14:23:38.369762 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod593c409e_6f89_4cb9_8ab0_7910612780db.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod593c409e_6f89_4cb9_8ab0_7910612780db.slice/crio-2a50d1f0c99bdb7f8526600576379385f9397a9272e98e1e55db5b8298016372\": RecentStats: unable to find data in memory cache]" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.147187 4753 generic.go:334] "Generic (PLEG): container finished" podID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerID="2db5d885e81eea5189f34241c33621a360d510b2296d497b487d4703bfd23c20" exitCode=0 Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.147254 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerDied","Data":"2db5d885e81eea5189f34241c33621a360d510b2296d497b487d4703bfd23c20"} Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.149471 4753 generic.go:334] "Generic (PLEG): container finished" podID="5d0d7f08-1d4d-451d-98c3-873ac514fd53" containerID="1382f4decacf5f428514245aaf0e2515208a47c41adb2f648c7202bccd9216f9" exitCode=0 Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.166959 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593c409e-6f89-4cb9-8ab0-7910612780db" path="/var/lib/kubelet/pods/593c409e-6f89-4cb9-8ab0-7910612780db/volumes" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.168070 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wml54" event={"ID":"5d0d7f08-1d4d-451d-98c3-873ac514fd53","Type":"ContainerDied","Data":"1382f4decacf5f428514245aaf0e2515208a47c41adb2f648c7202bccd9216f9"} Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.514311 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.591570 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-log-httpd\") pod \"a7d0250a-2990-49be-9486-ce7e4302e9e0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.591661 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-sg-core-conf-yaml\") pod \"a7d0250a-2990-49be-9486-ce7e4302e9e0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.591763 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-ceilometer-tls-certs\") pod \"a7d0250a-2990-49be-9486-ce7e4302e9e0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.591842 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-scripts\") pod \"a7d0250a-2990-49be-9486-ce7e4302e9e0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.591953 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-combined-ca-bundle\") pod \"a7d0250a-2990-49be-9486-ce7e4302e9e0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.592010 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c6mc\" (UniqueName: \"kubernetes.io/projected/a7d0250a-2990-49be-9486-ce7e4302e9e0-kube-api-access-2c6mc\") pod \"a7d0250a-2990-49be-9486-ce7e4302e9e0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.592113 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-config-data\") pod \"a7d0250a-2990-49be-9486-ce7e4302e9e0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.592332 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-run-httpd\") pod \"a7d0250a-2990-49be-9486-ce7e4302e9e0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.592338 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a7d0250a-2990-49be-9486-ce7e4302e9e0" (UID: "a7d0250a-2990-49be-9486-ce7e4302e9e0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.593171 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.593630 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a7d0250a-2990-49be-9486-ce7e4302e9e0" (UID: "a7d0250a-2990-49be-9486-ce7e4302e9e0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.600185 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d0250a-2990-49be-9486-ce7e4302e9e0-kube-api-access-2c6mc" (OuterVolumeSpecName: "kube-api-access-2c6mc") pod "a7d0250a-2990-49be-9486-ce7e4302e9e0" (UID: "a7d0250a-2990-49be-9486-ce7e4302e9e0"). InnerVolumeSpecName "kube-api-access-2c6mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.611373 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-scripts" (OuterVolumeSpecName: "scripts") pod "a7d0250a-2990-49be-9486-ce7e4302e9e0" (UID: "a7d0250a-2990-49be-9486-ce7e4302e9e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.629227 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a7d0250a-2990-49be-9486-ce7e4302e9e0" (UID: "a7d0250a-2990-49be-9486-ce7e4302e9e0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.660629 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a7d0250a-2990-49be-9486-ce7e4302e9e0" (UID: "a7d0250a-2990-49be-9486-ce7e4302e9e0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.693486 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-config-data" (OuterVolumeSpecName: "config-data") pod "a7d0250a-2990-49be-9486-ce7e4302e9e0" (UID: "a7d0250a-2990-49be-9486-ce7e4302e9e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.694435 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-config-data\") pod \"a7d0250a-2990-49be-9486-ce7e4302e9e0\" (UID: \"a7d0250a-2990-49be-9486-ce7e4302e9e0\") " Jan 29 14:23:40 crc kubenswrapper[4753]: W0129 14:23:40.694615 4753 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a7d0250a-2990-49be-9486-ce7e4302e9e0/volumes/kubernetes.io~secret/config-data Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.694648 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-config-data" (OuterVolumeSpecName: "config-data") pod "a7d0250a-2990-49be-9486-ce7e4302e9e0" (UID: "a7d0250a-2990-49be-9486-ce7e4302e9e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.694862 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.694883 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7d0250a-2990-49be-9486-ce7e4302e9e0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.694894 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.694903 4753 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.694912 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.694922 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c6mc\" (UniqueName: \"kubernetes.io/projected/a7d0250a-2990-49be-9486-ce7e4302e9e0-kube-api-access-2c6mc\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.696144 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7d0250a-2990-49be-9486-ce7e4302e9e0" (UID: "a7d0250a-2990-49be-9486-ce7e4302e9e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:40 crc kubenswrapper[4753]: I0129 14:23:40.796558 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d0250a-2990-49be-9486-ce7e4302e9e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.167308 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7d0250a-2990-49be-9486-ce7e4302e9e0","Type":"ContainerDied","Data":"aa13b726a6cb8767ea49c4829a1fecbcee0a3661bc1ba2a24dbe533e686cfd15"} Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.167420 4753 scope.go:117] "RemoveContainer" containerID="280cccf7b8efdec37605eddf07e4950ab5aa0f8080daf672409d33e82356a17b" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.167335 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.250011 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.257952 4753 scope.go:117] "RemoveContainer" containerID="7a91c94eac94ff5110e30ed78e7fd3319e7ed69912fde42cc444f8928153a890" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.274227 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.286319 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:41 crc kubenswrapper[4753]: E0129 14:23:41.286867 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593c409e-6f89-4cb9-8ab0-7910612780db" containerName="init" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.286916 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="593c409e-6f89-4cb9-8ab0-7910612780db" containerName="init" Jan 29 14:23:41 crc kubenswrapper[4753]: E0129 14:23:41.286935 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="ceilometer-central-agent" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.286945 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="ceilometer-central-agent" Jan 29 14:23:41 crc kubenswrapper[4753]: E0129 14:23:41.286958 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="proxy-httpd" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.286969 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="proxy-httpd" Jan 29 14:23:41 crc kubenswrapper[4753]: E0129 14:23:41.286995 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593c409e-6f89-4cb9-8ab0-7910612780db" containerName="dnsmasq-dns" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.287008 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="593c409e-6f89-4cb9-8ab0-7910612780db" containerName="dnsmasq-dns" Jan 29 14:23:41 crc kubenswrapper[4753]: E0129 14:23:41.287034 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="sg-core" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.287044 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="sg-core" Jan 29 
14:23:41 crc kubenswrapper[4753]: E0129 14:23:41.287076 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="ceilometer-notification-agent" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.287087 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="ceilometer-notification-agent" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.287338 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="ceilometer-notification-agent" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.287354 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="proxy-httpd" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.287377 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="sg-core" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.287387 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" containerName="ceilometer-central-agent" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.287406 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="593c409e-6f89-4cb9-8ab0-7910612780db" containerName="dnsmasq-dns" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.289969 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.297039 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.297560 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.297711 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.307207 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-log-httpd\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.307271 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.307299 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-scripts\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.307345 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr4sw\" (UniqueName: \"kubernetes.io/projected/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-kube-api-access-kr4sw\") pod \"ceilometer-0\" (UID: 
\"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.307384 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.307424 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.307507 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-config-data\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.307533 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-run-httpd\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.307567 4753 scope.go:117] "RemoveContainer" containerID="17bb93e333b2823661450eda7f76dad7ce26c1c71e2f9248b07e522654054068" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.315033 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.334799 4753 scope.go:117] "RemoveContainer" containerID="2db5d885e81eea5189f34241c33621a360d510b2296d497b487d4703bfd23c20" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.412074 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-log-httpd\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.412143 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.412178 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-scripts\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.412228 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr4sw\" (UniqueName: \"kubernetes.io/projected/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-kube-api-access-kr4sw\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 
14:23:41.412480 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.412644 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.412758 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-log-httpd\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.412857 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-config-data\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.412900 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-run-httpd\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.413419 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-run-httpd\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.416389 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.416836 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.417707 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.421797 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-config-data\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.426204 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-scripts\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.429090 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr4sw\" (UniqueName: \"kubernetes.io/projected/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-kube-api-access-kr4sw\") pod \"ceilometer-0\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.606299 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.614364 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.615335 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-config-data\") pod \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.615460 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfgxn\" (UniqueName: \"kubernetes.io/projected/5d0d7f08-1d4d-451d-98c3-873ac514fd53-kube-api-access-lfgxn\") pod \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.615526 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-scripts\") pod \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.615603 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-combined-ca-bundle\") pod \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\" (UID: \"5d0d7f08-1d4d-451d-98c3-873ac514fd53\") " Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.622630 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-scripts" (OuterVolumeSpecName: "scripts") pod "5d0d7f08-1d4d-451d-98c3-873ac514fd53" (UID: "5d0d7f08-1d4d-451d-98c3-873ac514fd53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.625955 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0d7f08-1d4d-451d-98c3-873ac514fd53-kube-api-access-lfgxn" (OuterVolumeSpecName: "kube-api-access-lfgxn") pod "5d0d7f08-1d4d-451d-98c3-873ac514fd53" (UID: "5d0d7f08-1d4d-451d-98c3-873ac514fd53"). InnerVolumeSpecName "kube-api-access-lfgxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.641954 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-config-data" (OuterVolumeSpecName: "config-data") pod "5d0d7f08-1d4d-451d-98c3-873ac514fd53" (UID: "5d0d7f08-1d4d-451d-98c3-873ac514fd53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.652427 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d0d7f08-1d4d-451d-98c3-873ac514fd53" (UID: "5d0d7f08-1d4d-451d-98c3-873ac514fd53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.718292 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.718496 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.718617 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfgxn\" (UniqueName: \"kubernetes.io/projected/5d0d7f08-1d4d-451d-98c3-873ac514fd53-kube-api-access-lfgxn\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.718758 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0d7f08-1d4d-451d-98c3-873ac514fd53-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:41 crc kubenswrapper[4753]: I0129 14:23:41.905399 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.167926 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d0250a-2990-49be-9486-ce7e4302e9e0" path="/var/lib/kubelet/pods/a7d0250a-2990-49be-9486-ce7e4302e9e0/volumes" Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.181480 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerStarted","Data":"13ebadd82bb3bab52a4dc2e23a96c886a9bb225f678fbd5dc64f63564f50661a"} Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.182886 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wml54" event={"ID":"5d0d7f08-1d4d-451d-98c3-873ac514fd53","Type":"ContainerDied","Data":"a90e38b874e69019e1ccab80f34e57ac51635c41e3df8c3e86e195503098495c"} Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.182918 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90e38b874e69019e1ccab80f34e57ac51635c41e3df8c3e86e195503098495c" Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.182977 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wml54" Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.387290 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.387952 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerName="nova-api-log" containerID="cri-o://804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86" gracePeriod=30 Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.388075 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerName="nova-api-api" containerID="cri-o://4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b" gracePeriod=30 Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.399743 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.399951 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1fff4213-2daa-4a3d-9802-ee2f39232e15" containerName="nova-scheduler-scheduler" containerID="cri-o://d21f9f454fbad56dc31ecf236885f4f7d85e3d99cbd2b1aab2f6aca36d593874" gracePeriod=30 Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.440992 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.441280 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-log" containerID="cri-o://06f2a776446be8e207b874f0c09730d6dd63037c338c1c964cfe4ac94fa1b220" gracePeriod=30 Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.441629 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-metadata" containerID="cri-o://308efde5a1b73816a7b74039efd8e127ce5aafd5f6d5ee3906227a9fcfccf2b8" gracePeriod=30 Jan 29 14:23:42 crc kubenswrapper[4753]: I0129 14:23:42.923085 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.056657 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-public-tls-certs\") pod \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.056719 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-combined-ca-bundle\") pod \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.056798 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcjsl\" (UniqueName: \"kubernetes.io/projected/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-kube-api-access-lcjsl\") pod \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.056844 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-logs\") pod \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.056869 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-internal-tls-certs\") pod \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.056964 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-config-data\") pod \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\" (UID: \"1eb78eb9-7d3f-42cf-a308-2fb173bcb284\") " Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.057753 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-logs" (OuterVolumeSpecName: "logs") pod "1eb78eb9-7d3f-42cf-a308-2fb173bcb284" (UID: "1eb78eb9-7d3f-42cf-a308-2fb173bcb284"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.063098 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-kube-api-access-lcjsl" (OuterVolumeSpecName: "kube-api-access-lcjsl") pod "1eb78eb9-7d3f-42cf-a308-2fb173bcb284" (UID: "1eb78eb9-7d3f-42cf-a308-2fb173bcb284"). InnerVolumeSpecName "kube-api-access-lcjsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.083963 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-config-data" (OuterVolumeSpecName: "config-data") pod "1eb78eb9-7d3f-42cf-a308-2fb173bcb284" (UID: "1eb78eb9-7d3f-42cf-a308-2fb173bcb284"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.100229 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eb78eb9-7d3f-42cf-a308-2fb173bcb284" (UID: "1eb78eb9-7d3f-42cf-a308-2fb173bcb284"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.120525 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1eb78eb9-7d3f-42cf-a308-2fb173bcb284" (UID: "1eb78eb9-7d3f-42cf-a308-2fb173bcb284"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.132117 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1eb78eb9-7d3f-42cf-a308-2fb173bcb284" (UID: "1eb78eb9-7d3f-42cf-a308-2fb173bcb284"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.159438 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.159482 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.159496 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.159509 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcjsl\" (UniqueName: \"kubernetes.io/projected/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-kube-api-access-lcjsl\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.159521 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.159533 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eb78eb9-7d3f-42cf-a308-2fb173bcb284-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.197419 4753 generic.go:334] "Generic (PLEG): container finished" podID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerID="4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b" exitCode=0 Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.197447 4753 generic.go:334] "Generic (PLEG): container finished" podID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerID="804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86" exitCode=143 Jan 29 14:23:43 crc kubenswrapper[4753]: 
I0129 14:23:43.197494 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.197506 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1eb78eb9-7d3f-42cf-a308-2fb173bcb284","Type":"ContainerDied","Data":"4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b"} Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.197583 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1eb78eb9-7d3f-42cf-a308-2fb173bcb284","Type":"ContainerDied","Data":"804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86"} Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.197637 4753 scope.go:117] "RemoveContainer" containerID="4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.197640 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1eb78eb9-7d3f-42cf-a308-2fb173bcb284","Type":"ContainerDied","Data":"b66360fcfaa829999d42bdac77784cd8615f12db83cf79e82c38f10484579b67"} Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.200040 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerStarted","Data":"77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd"} Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.202517 4753 generic.go:334] "Generic (PLEG): container finished" podID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerID="06f2a776446be8e207b874f0c09730d6dd63037c338c1c964cfe4ac94fa1b220" exitCode=143 Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.202550 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f9b383a-32b0-4302-96ae-4bcd900cd383","Type":"ContainerDied","Data":"06f2a776446be8e207b874f0c09730d6dd63037c338c1c964cfe4ac94fa1b220"} Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.227631 4753 scope.go:117] "RemoveContainer" containerID="804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86" Jan 29 14:23:43 crc kubenswrapper[4753]: E0129 14:23:43.257596 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d21f9f454fbad56dc31ecf236885f4f7d85e3d99cbd2b1aab2f6aca36d593874" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 14:23:43 crc kubenswrapper[4753]: E0129 14:23:43.261360 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d21f9f454fbad56dc31ecf236885f4f7d85e3d99cbd2b1aab2f6aca36d593874" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 14:23:43 crc kubenswrapper[4753]: E0129 14:23:43.271034 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d21f9f454fbad56dc31ecf236885f4f7d85e3d99cbd2b1aab2f6aca36d593874" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 14:23:43 crc kubenswrapper[4753]: E0129 14:23:43.271092 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1fff4213-2daa-4a3d-9802-ee2f39232e15" containerName="nova-scheduler-scheduler" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.282422 4753 scope.go:117] "RemoveContainer" containerID="4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.283879 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:43 crc kubenswrapper[4753]: E0129 14:23:43.285212 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b\": container with ID starting with 4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b not found: ID does not exist" containerID="4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.285256 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b"} err="failed to get container status \"4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b\": rpc error: code = NotFound desc = could not find container \"4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b\": container with ID starting with 4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b not found: ID does not exist" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.285344 4753 scope.go:117] "RemoveContainer" containerID="804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86" Jan 29 14:23:43 crc kubenswrapper[4753]: E0129 14:23:43.289490 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86\": container with ID starting with 804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86 not found: ID does not exist" containerID="804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.289543 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86"} err="failed to get container status \"804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86\": rpc error: code = NotFound desc = could not find container \"804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86\": container with ID starting with 804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86 not found: ID does not exist" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.289586 4753 scope.go:117] "RemoveContainer" containerID="4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.290366 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b"} err="failed to get container status \"4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b\": rpc error: code = NotFound desc = could not find container \"4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b\": container with ID starting with 
4abcc60a32b46da628f410baaf45f01669fa805cd98b27ba5466e6c78586d73b not found: ID does not exist" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.290398 4753 scope.go:117] "RemoveContainer" containerID="804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.294375 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86"} err="failed to get container status \"804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86\": rpc error: code = NotFound desc = could not find container \"804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86\": container with ID starting with 804531e2367a8e041dea8167970b7c59ea7f58ae21d125e6aa429e6a51a1ab86 not found: ID does not exist" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.297614 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.310553 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:43 crc kubenswrapper[4753]: E0129 14:23:43.310934 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerName="nova-api-api" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.310952 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerName="nova-api-api" Jan 29 14:23:43 crc kubenswrapper[4753]: E0129 14:23:43.310964 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0d7f08-1d4d-451d-98c3-873ac514fd53" containerName="nova-manage" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.310970 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0d7f08-1d4d-451d-98c3-873ac514fd53" containerName="nova-manage" Jan 29 14:23:43 crc kubenswrapper[4753]: E0129 14:23:43.310986 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerName="nova-api-log" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.310992 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerName="nova-api-log" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.311175 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerName="nova-api-log" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.311186 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0d7f08-1d4d-451d-98c3-873ac514fd53" containerName="nova-manage" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.311204 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" containerName="nova-api-api" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.312118 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.314898 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.315105 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.315895 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.320543 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.479178 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.479222 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13672aee-1e34-4763-88d7-35ac9b484c87-logs\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.479243 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-config-data\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.479314 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-internal-tls-certs\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.479342 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqshj\" (UniqueName: \"kubernetes.io/projected/13672aee-1e34-4763-88d7-35ac9b484c87-kube-api-access-hqshj\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.479408 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-public-tls-certs\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.580743 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-public-tls-certs\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.580809 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.580825 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13672aee-1e34-4763-88d7-35ac9b484c87-logs\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.580840 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-config-data\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.580908 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-internal-tls-certs\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.580934 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqshj\" (UniqueName: \"kubernetes.io/projected/13672aee-1e34-4763-88d7-35ac9b484c87-kube-api-access-hqshj\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.581727 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13672aee-1e34-4763-88d7-35ac9b484c87-logs\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.589620 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.590658 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-config-data\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.595878 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-public-tls-certs\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.600798 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqshj\" (UniqueName: \"kubernetes.io/projected/13672aee-1e34-4763-88d7-35ac9b484c87-kube-api-access-hqshj\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.601270 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-internal-tls-certs\") pod \"nova-api-0\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " 
pod="openstack/nova-api-0" Jan 29 14:23:43 crc kubenswrapper[4753]: I0129 14:23:43.689012 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:23:44 crc kubenswrapper[4753]: I0129 14:23:44.164646 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eb78eb9-7d3f-42cf-a308-2fb173bcb284" path="/var/lib/kubelet/pods/1eb78eb9-7d3f-42cf-a308-2fb173bcb284/volumes" Jan 29 14:23:44 crc kubenswrapper[4753]: W0129 14:23:44.194595 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13672aee_1e34_4763_88d7_35ac9b484c87.slice/crio-2b15e24310ae7db5d1643f62ac2bdf1b6aff69a043369d89e1e5e87030e70979 WatchSource:0}: Error finding container 2b15e24310ae7db5d1643f62ac2bdf1b6aff69a043369d89e1e5e87030e70979: Status 404 returned error can't find the container with id 2b15e24310ae7db5d1643f62ac2bdf1b6aff69a043369d89e1e5e87030e70979 Jan 29 14:23:44 crc kubenswrapper[4753]: I0129 14:23:44.195470 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:23:44 crc kubenswrapper[4753]: I0129 14:23:44.233756 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerStarted","Data":"f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5"} Jan 29 14:23:44 crc kubenswrapper[4753]: I0129 14:23:44.238231 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"13672aee-1e34-4763-88d7-35ac9b484c87","Type":"ContainerStarted","Data":"2b15e24310ae7db5d1643f62ac2bdf1b6aff69a043369d89e1e5e87030e70979"} Jan 29 14:23:45 crc kubenswrapper[4753]: I0129 14:23:45.248414 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerStarted","Data":"8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4"} Jan 29 14:23:45 crc kubenswrapper[4753]: I0129 14:23:45.250564 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"13672aee-1e34-4763-88d7-35ac9b484c87","Type":"ContainerStarted","Data":"49145b48bee89f5fd944b3ebdb12f7d989505bd4659c96c697f65f65d2481518"} Jan 29 14:23:45 crc kubenswrapper[4753]: I0129 14:23:45.250585 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"13672aee-1e34-4763-88d7-35ac9b484c87","Type":"ContainerStarted","Data":"63959aa4ed603fda25a0942fccae384fdcda0338c3bd1c0131967af7f34b728b"} Jan 29 14:23:45 crc kubenswrapper[4753]: I0129 14:23:45.273201 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.273178015 podStartE2EDuration="2.273178015s" podCreationTimestamp="2026-01-29 14:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:45.268780087 +0000 UTC m=+1259.963514559" watchObservedRunningTime="2026-01-29 14:23:45.273178015 +0000 UTC m=+1259.967912397" Jan 29 14:23:45 crc kubenswrapper[4753]: I0129 14:23:45.597369 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:44148->10.217.0.192:8775: read: connection reset by peer" 
Jan 29 14:23:45 crc kubenswrapper[4753]: I0129 14:23:45.597778 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:44132->10.217.0.192:8775: read: connection reset by peer" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.262617 4753 generic.go:334] "Generic (PLEG): container finished" podID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerID="308efde5a1b73816a7b74039efd8e127ce5aafd5f6d5ee3906227a9fcfccf2b8" exitCode=0 Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.263027 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f9b383a-32b0-4302-96ae-4bcd900cd383","Type":"ContainerDied","Data":"308efde5a1b73816a7b74039efd8e127ce5aafd5f6d5ee3906227a9fcfccf2b8"} Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.263271 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f9b383a-32b0-4302-96ae-4bcd900cd383","Type":"ContainerDied","Data":"7ade97f6a50978c412d673c038a2ea1a096aec1c8a7e3af70a77c8dd43c962b0"} Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.263288 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ade97f6a50978c412d673c038a2ea1a096aec1c8a7e3af70a77c8dd43c962b0" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.266432 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.441128 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-config-data\") pod \"7f9b383a-32b0-4302-96ae-4bcd900cd383\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.441203 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-combined-ca-bundle\") pod \"7f9b383a-32b0-4302-96ae-4bcd900cd383\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.441324 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-nova-metadata-tls-certs\") pod \"7f9b383a-32b0-4302-96ae-4bcd900cd383\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.441384 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9b383a-32b0-4302-96ae-4bcd900cd383-logs\") pod \"7f9b383a-32b0-4302-96ae-4bcd900cd383\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.441427 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spv8n\" (UniqueName: \"kubernetes.io/projected/7f9b383a-32b0-4302-96ae-4bcd900cd383-kube-api-access-spv8n\") pod \"7f9b383a-32b0-4302-96ae-4bcd900cd383\" (UID: \"7f9b383a-32b0-4302-96ae-4bcd900cd383\") " Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.443771 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7f9b383a-32b0-4302-96ae-4bcd900cd383-logs" (OuterVolumeSpecName: "logs") pod "7f9b383a-32b0-4302-96ae-4bcd900cd383" (UID: "7f9b383a-32b0-4302-96ae-4bcd900cd383"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.448880 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9b383a-32b0-4302-96ae-4bcd900cd383-kube-api-access-spv8n" (OuterVolumeSpecName: "kube-api-access-spv8n") pod "7f9b383a-32b0-4302-96ae-4bcd900cd383" (UID: "7f9b383a-32b0-4302-96ae-4bcd900cd383"). InnerVolumeSpecName "kube-api-access-spv8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.475308 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f9b383a-32b0-4302-96ae-4bcd900cd383" (UID: "7f9b383a-32b0-4302-96ae-4bcd900cd383"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.483029 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-config-data" (OuterVolumeSpecName: "config-data") pod "7f9b383a-32b0-4302-96ae-4bcd900cd383" (UID: "7f9b383a-32b0-4302-96ae-4bcd900cd383"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.544295 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9b383a-32b0-4302-96ae-4bcd900cd383-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.544361 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spv8n\" (UniqueName: \"kubernetes.io/projected/7f9b383a-32b0-4302-96ae-4bcd900cd383-kube-api-access-spv8n\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.544381 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.544395 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.545305 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7f9b383a-32b0-4302-96ae-4bcd900cd383" (UID: "7f9b383a-32b0-4302-96ae-4bcd900cd383"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:46 crc kubenswrapper[4753]: I0129 14:23:46.645854 4753 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9b383a-32b0-4302-96ae-4bcd900cd383-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.277948 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerStarted","Data":"0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc"} Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.278432 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.282866 4753 generic.go:334] "Generic (PLEG): container finished" podID="1fff4213-2daa-4a3d-9802-ee2f39232e15" containerID="d21f9f454fbad56dc31ecf236885f4f7d85e3d99cbd2b1aab2f6aca36d593874" exitCode=0 Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.282940 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.282967 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fff4213-2daa-4a3d-9802-ee2f39232e15","Type":"ContainerDied","Data":"d21f9f454fbad56dc31ecf236885f4f7d85e3d99cbd2b1aab2f6aca36d593874"} Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.316577 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.155050143 podStartE2EDuration="6.316555035s" podCreationTimestamp="2026-01-29 14:23:41 +0000 UTC" firstStartedPulling="2026-01-29 14:23:41.913846597 +0000 UTC m=+1256.608580989" lastFinishedPulling="2026-01-29 14:23:46.075351499 +0000 UTC m=+1260.770085881" observedRunningTime="2026-01-29 14:23:47.307359359 +0000 UTC m=+1262.002093741" watchObservedRunningTime="2026-01-29 14:23:47.316555035 +0000 UTC m=+1262.011289417" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.339724 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.375831 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.395953 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:23:47 crc kubenswrapper[4753]: E0129 14:23:47.398877 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-metadata" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.398903 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-metadata" Jan 29 14:23:47 crc kubenswrapper[4753]: E0129 14:23:47.398920 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-log" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.398930 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-log" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.400067 4753 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-metadata" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.400093 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" containerName="nova-metadata-log" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.402020 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.405913 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.405952 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.440200 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.575636 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.584050 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.584120 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-config-data\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.584172 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7947b8f0-b134-40d9-beba-116bbb51a1c2-logs\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.584304 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txzts\" (UniqueName: \"kubernetes.io/projected/7947b8f0-b134-40d9-beba-116bbb51a1c2-kube-api-access-txzts\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.584612 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.686637 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4ts6\" (UniqueName: \"kubernetes.io/projected/1fff4213-2daa-4a3d-9802-ee2f39232e15-kube-api-access-w4ts6\") pod \"1fff4213-2daa-4a3d-9802-ee2f39232e15\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.686800 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-config-data\") pod \"1fff4213-2daa-4a3d-9802-ee2f39232e15\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.686837 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-combined-ca-bundle\") pod \"1fff4213-2daa-4a3d-9802-ee2f39232e15\" (UID: \"1fff4213-2daa-4a3d-9802-ee2f39232e15\") " Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.687336 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.687381 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-config-data\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.687407 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7947b8f0-b134-40d9-beba-116bbb51a1c2-logs\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.687438 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txzts\" (UniqueName: \"kubernetes.io/projected/7947b8f0-b134-40d9-beba-116bbb51a1c2-kube-api-access-txzts\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.687527 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.688140 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7947b8f0-b134-40d9-beba-116bbb51a1c2-logs\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.692967 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.693264 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-config-data\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.695023 4753 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.695927 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fff4213-2daa-4a3d-9802-ee2f39232e15-kube-api-access-w4ts6" (OuterVolumeSpecName: "kube-api-access-w4ts6") pod "1fff4213-2daa-4a3d-9802-ee2f39232e15" (UID: "1fff4213-2daa-4a3d-9802-ee2f39232e15"). InnerVolumeSpecName "kube-api-access-w4ts6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.709362 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txzts\" (UniqueName: \"kubernetes.io/projected/7947b8f0-b134-40d9-beba-116bbb51a1c2-kube-api-access-txzts\") pod \"nova-metadata-0\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.716118 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fff4213-2daa-4a3d-9802-ee2f39232e15" (UID: "1fff4213-2daa-4a3d-9802-ee2f39232e15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.721411 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-config-data" (OuterVolumeSpecName: "config-data") pod "1fff4213-2daa-4a3d-9802-ee2f39232e15" (UID: "1fff4213-2daa-4a3d-9802-ee2f39232e15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.728010 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.789204 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4ts6\" (UniqueName: \"kubernetes.io/projected/1fff4213-2daa-4a3d-9802-ee2f39232e15-kube-api-access-w4ts6\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.789256 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:47 crc kubenswrapper[4753]: I0129 14:23:47.789275 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fff4213-2daa-4a3d-9802-ee2f39232e15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.171636 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9b383a-32b0-4302-96ae-4bcd900cd383" path="/var/lib/kubelet/pods/7f9b383a-32b0-4302-96ae-4bcd900cd383/volumes" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.215582 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.339567 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7947b8f0-b134-40d9-beba-116bbb51a1c2","Type":"ContainerStarted","Data":"c469b20772701e96ce71bae29e31c81555dc717d1d9369f4c7fa004bbaac807d"} Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.343968 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.345002 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fff4213-2daa-4a3d-9802-ee2f39232e15","Type":"ContainerDied","Data":"6edc214cbc4c81d13d9353a233b801d7de40b8232d1f179121b79e96bf952152"} Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.345038 4753 scope.go:117] "RemoveContainer" containerID="d21f9f454fbad56dc31ecf236885f4f7d85e3d99cbd2b1aab2f6aca36d593874" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.390186 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.425845 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.445284 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:48 crc kubenswrapper[4753]: E0129 14:23:48.445877 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fff4213-2daa-4a3d-9802-ee2f39232e15" containerName="nova-scheduler-scheduler" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.445897 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fff4213-2daa-4a3d-9802-ee2f39232e15" containerName="nova-scheduler-scheduler" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.446183 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fff4213-2daa-4a3d-9802-ee2f39232e15" containerName="nova-scheduler-scheduler" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.446965 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.450481 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.460659 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.605025 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5xw9\" (UniqueName: \"kubernetes.io/projected/15b88ba9-8449-4e76-a36c-34ca2b2be488-kube-api-access-d5xw9\") pod \"nova-scheduler-0\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.605469 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.605712 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-config-data\") pod \"nova-scheduler-0\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.707367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5xw9\" (UniqueName: \"kubernetes.io/projected/15b88ba9-8449-4e76-a36c-34ca2b2be488-kube-api-access-d5xw9\") pod \"nova-scheduler-0\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.707774 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.708317 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-config-data\") pod \"nova-scheduler-0\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.712765 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-config-data\") pod \"nova-scheduler-0\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.713199 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.723773 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5xw9\" (UniqueName: 
\"kubernetes.io/projected/15b88ba9-8449-4e76-a36c-34ca2b2be488-kube-api-access-d5xw9\") pod \"nova-scheduler-0\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " pod="openstack/nova-scheduler-0" Jan 29 14:23:48 crc kubenswrapper[4753]: I0129 14:23:48.804196 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:23:49 crc kubenswrapper[4753]: I0129 14:23:49.362986 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7947b8f0-b134-40d9-beba-116bbb51a1c2","Type":"ContainerStarted","Data":"0a800f0f0fcb4a6f6136a9675b82b2ab62e4096da3d7cc45f7830d2af553041f"} Jan 29 14:23:49 crc kubenswrapper[4753]: I0129 14:23:49.363338 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7947b8f0-b134-40d9-beba-116bbb51a1c2","Type":"ContainerStarted","Data":"f34f5bfa0d06f0211a2a796c39c41d96c4d203bd1909c7053e13afe8556789c1"} Jan 29 14:23:49 crc kubenswrapper[4753]: W0129 14:23:49.373524 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15b88ba9_8449_4e76_a36c_34ca2b2be488.slice/crio-9a062ab3deb87a8eed7b4c1310901e0de2721c9113c51a5bae9d5b93a5fad4af WatchSource:0}: Error finding container 9a062ab3deb87a8eed7b4c1310901e0de2721c9113c51a5bae9d5b93a5fad4af: Status 404 returned error can't find the container with id 9a062ab3deb87a8eed7b4c1310901e0de2721c9113c51a5bae9d5b93a5fad4af Jan 29 14:23:49 crc kubenswrapper[4753]: I0129 14:23:49.375977 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:23:49 crc kubenswrapper[4753]: I0129 14:23:49.405117 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.405090339 podStartE2EDuration="2.405090339s" podCreationTimestamp="2026-01-29 14:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:49.399671334 +0000 UTC m=+1264.094405756" watchObservedRunningTime="2026-01-29 14:23:49.405090339 +0000 UTC m=+1264.099824761" Jan 29 14:23:50 crc kubenswrapper[4753]: I0129 14:23:50.168503 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fff4213-2daa-4a3d-9802-ee2f39232e15" path="/var/lib/kubelet/pods/1fff4213-2daa-4a3d-9802-ee2f39232e15/volumes" Jan 29 14:23:50 crc kubenswrapper[4753]: I0129 14:23:50.377747 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15b88ba9-8449-4e76-a36c-34ca2b2be488","Type":"ContainerStarted","Data":"24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539"} Jan 29 14:23:50 crc kubenswrapper[4753]: I0129 14:23:50.378050 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15b88ba9-8449-4e76-a36c-34ca2b2be488","Type":"ContainerStarted","Data":"9a062ab3deb87a8eed7b4c1310901e0de2721c9113c51a5bae9d5b93a5fad4af"} Jan 29 14:23:50 crc kubenswrapper[4753]: I0129 14:23:50.421924 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.421892063 podStartE2EDuration="2.421892063s" podCreationTimestamp="2026-01-29 14:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 14:23:50.408193498 +0000 UTC m=+1265.102927920" 
watchObservedRunningTime="2026-01-29 14:23:50.421892063 +0000 UTC m=+1265.116626485" Jan 29 14:23:52 crc kubenswrapper[4753]: I0129 14:23:52.728472 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 14:23:52 crc kubenswrapper[4753]: I0129 14:23:52.728968 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 14:23:53 crc kubenswrapper[4753]: I0129 14:23:53.689979 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 14:23:53 crc kubenswrapper[4753]: I0129 14:23:53.690395 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 14:23:53 crc kubenswrapper[4753]: I0129 14:23:53.805550 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 14:23:54 crc kubenswrapper[4753]: I0129 14:23:54.711470 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 14:23:54 crc kubenswrapper[4753]: I0129 14:23:54.711494 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 14:23:57 crc kubenswrapper[4753]: I0129 14:23:57.729304 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 14:23:57 crc kubenswrapper[4753]: I0129 14:23:57.730042 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 14:23:58 crc kubenswrapper[4753]: I0129 14:23:58.748307 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 29 14:23:58 crc kubenswrapper[4753]: I0129 14:23:58.748347 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 14:23:58 crc kubenswrapper[4753]: I0129 14:23:58.805119 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 29 14:23:58 crc kubenswrapper[4753]: I0129 14:23:58.866433 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 14:23:59 crc kubenswrapper[4753]: I0129 14:23:59.537486 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 14:24:03 crc kubenswrapper[4753]: I0129 14:24:03.702577 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 29 14:24:03 crc kubenswrapper[4753]: I0129 14:24:03.703703 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Jan 29 14:24:03 crc kubenswrapper[4753]: I0129 14:24:03.704184 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 14:24:03 crc kubenswrapper[4753]: I0129 14:24:03.704256 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 14:24:03 crc kubenswrapper[4753]: I0129 14:24:03.713996 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 14:24:03 crc kubenswrapper[4753]: I0129 14:24:03.715880 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 14:24:07 crc kubenswrapper[4753]: I0129 14:24:07.741507 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 14:24:07 crc kubenswrapper[4753]: I0129 14:24:07.742928 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 14:24:07 crc kubenswrapper[4753]: I0129 14:24:07.753233 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 29 14:24:07 crc kubenswrapper[4753]: I0129 14:24:07.753867 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 29 14:24:11 crc kubenswrapper[4753]: I0129 14:24:11.624237 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 29 14:24:27 crc kubenswrapper[4753]: I0129 14:24:27.054874 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:24:27 crc kubenswrapper[4753]: I0129 14:24:27.056354 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.538218 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xcsvp"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.551760 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xcsvp"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.652223 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-c286z"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.653618 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c286z" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.658688 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.660991 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c286z"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.716241 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.733975 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b7f8-account-create-update-2zm4w"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.735181 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.736998 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.748040 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts\") pod \"root-account-create-update-c286z\" (UID: \"558ab235-719b-4913-aab2-863bfa6586e8\") " pod="openstack/root-account-create-update-c286z" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.748111 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2s4w\" (UniqueName: \"kubernetes.io/projected/558ab235-719b-4913-aab2-863bfa6586e8-kube-api-access-v2s4w\") pod \"root-account-create-update-c286z\" (UID: \"558ab235-719b-4913-aab2-863bfa6586e8\") " pod="openstack/root-account-create-update-c286z" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.772348 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b7f8-account-create-update-2zm4w"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.792579 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.792786 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="606db0f5-bff8-4d65-bbad-63b8b1ba362c" containerName="openstackclient" containerID="cri-o://b9f6872bb6478c7a1880d301b55f528a8869bd68eb2346a5287f912f5e9ed844" gracePeriod=2 Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.809235 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.851940 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts\") pod \"root-account-create-update-c286z\" (UID: \"558ab235-719b-4913-aab2-863bfa6586e8\") " pod="openstack/root-account-create-update-c286z" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.852044 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc24c61-81b8-4f51-a2c7-961548f1b11b-operator-scripts\") pod \"neutron-b7f8-account-create-update-2zm4w\" (UID: 
\"0dc24c61-81b8-4f51-a2c7-961548f1b11b\") " pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.852125 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2s4w\" (UniqueName: \"kubernetes.io/projected/558ab235-719b-4913-aab2-863bfa6586e8-kube-api-access-v2s4w\") pod \"root-account-create-update-c286z\" (UID: \"558ab235-719b-4913-aab2-863bfa6586e8\") " pod="openstack/root-account-create-update-c286z" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.852301 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh67z\" (UniqueName: \"kubernetes.io/projected/0dc24c61-81b8-4f51-a2c7-961548f1b11b-kube-api-access-mh67z\") pod \"neutron-b7f8-account-create-update-2zm4w\" (UID: \"0dc24c61-81b8-4f51-a2c7-961548f1b11b\") " pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:32 crc kubenswrapper[4753]: E0129 14:24:32.855765 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:32 crc kubenswrapper[4753]: E0129 14:24:32.855816 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data podName:5f7e3e27-a036-4623-8d63-557a3c0d76e6 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:33.355802775 +0000 UTC m=+1308.050537157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6") : configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.856744 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts\") pod \"root-account-create-update-c286z\" (UID: \"558ab235-719b-4913-aab2-863bfa6586e8\") " pod="openstack/root-account-create-update-c286z" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.868282 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.868589 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerName="cinder-scheduler" containerID="cri-o://5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef" gracePeriod=30 Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.869031 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerName="probe" containerID="cri-o://b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046" gracePeriod=30 Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.890233 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9257-account-create-update-bzjs4"] Jan 29 14:24:32 crc kubenswrapper[4753]: E0129 14:24:32.890621 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606db0f5-bff8-4d65-bbad-63b8b1ba362c" containerName="openstackclient" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.890636 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="606db0f5-bff8-4d65-bbad-63b8b1ba362c" containerName="openstackclient" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.890845 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="606db0f5-bff8-4d65-bbad-63b8b1ba362c" containerName="openstackclient" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.891429 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.899216 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.913643 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2s4w\" (UniqueName: \"kubernetes.io/projected/558ab235-719b-4913-aab2-863bfa6586e8-kube-api-access-v2s4w\") pod \"root-account-create-update-c286z\" (UID: \"558ab235-719b-4913-aab2-863bfa6586e8\") " pod="openstack/root-account-create-update-c286z" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.949442 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b7f8-account-create-update-b2tcz"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.956378 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8nw\" (UniqueName: \"kubernetes.io/projected/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-kube-api-access-9v8nw\") pod \"barbican-9257-account-create-update-bzjs4\" (UID: \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\") " pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.956453 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc24c61-81b8-4f51-a2c7-961548f1b11b-operator-scripts\") pod \"neutron-b7f8-account-create-update-2zm4w\" (UID: \"0dc24c61-81b8-4f51-a2c7-961548f1b11b\") " pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.956512 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-operator-scripts\") pod \"barbican-9257-account-create-update-bzjs4\" (UID: \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\") " pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.956550 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh67z\" (UniqueName: \"kubernetes.io/projected/0dc24c61-81b8-4f51-a2c7-961548f1b11b-kube-api-access-mh67z\") pod \"neutron-b7f8-account-create-update-2zm4w\" (UID: \"0dc24c61-81b8-4f51-a2c7-961548f1b11b\") " pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.957589 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc24c61-81b8-4f51-a2c7-961548f1b11b-operator-scripts\") pod \"neutron-b7f8-account-create-update-2zm4w\" (UID: \"0dc24c61-81b8-4f51-a2c7-961548f1b11b\") " pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.970741 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b7f8-account-create-update-b2tcz"] Jan 29 14:24:32 crc 
kubenswrapper[4753]: I0129 14:24:32.985807 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c286z" Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.986315 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9257-account-create-update-bzjs4"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.999346 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 14:24:32 crc kubenswrapper[4753]: I0129 14:24:32.999649 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" containerName="ovn-northd" containerID="cri-o://b19308d0814c3df635fdb38228ce9b7ebf5a99fefcc0274c1834d736932a59bd" gracePeriod=30 Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.000114 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" containerName="openstack-network-exporter" containerID="cri-o://0c9a422d95efc2b8a373980fb0f3a46037a883ada7821c87b5bc7209541856f4" gracePeriod=30 Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.030579 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh67z\" (UniqueName: \"kubernetes.io/projected/0dc24c61-81b8-4f51-a2c7-961548f1b11b-kube-api-access-mh67z\") pod \"neutron-b7f8-account-create-update-2zm4w\" (UID: \"0dc24c61-81b8-4f51-a2c7-961548f1b11b\") " pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.047250 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.047550 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerName="cinder-api-log" containerID="cri-o://656f30c915a464984557b8cece588ce1d9b95d296a1ac31530c3c6877585393c" gracePeriod=30 Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.047877 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerName="cinder-api" containerID="cri-o://a111f231b7967402acc681815c59ca3c8b5a6d1e5677b5d94e8de77f77841cbc" gracePeriod=30 Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.059296 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.059674 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8nw\" (UniqueName: \"kubernetes.io/projected/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-kube-api-access-9v8nw\") pod \"barbican-9257-account-create-update-bzjs4\" (UID: \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\") " pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.059833 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-operator-scripts\") pod \"barbican-9257-account-create-update-bzjs4\" (UID: \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\") " pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.069551 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-operator-scripts\") pod \"barbican-9257-account-create-update-bzjs4\" (UID: \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\") " pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.088224 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9257-account-create-update-gx8hz"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.157865 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8nw\" (UniqueName: \"kubernetes.io/projected/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-kube-api-access-9v8nw\") pod \"barbican-9257-account-create-update-bzjs4\" (UID: \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\") " pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.168894 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9257-account-create-update-gx8hz"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.229328 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.259931 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.272369 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c053-account-create-update-x8gmz"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.273974 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.291602 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.335698 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c053-account-create-update-x8gmz"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.387408 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c053-account-create-update-pbzxx"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.399735 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4dbbc-e575-4d30-bba1-f3785c1e497e-operator-scripts\") pod \"glance-c053-account-create-update-x8gmz\" (UID: \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\") " pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.399776 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkq48\" (UniqueName: \"kubernetes.io/projected/31a4dbbc-e575-4d30-bba1-f3785c1e497e-kube-api-access-lkq48\") pod \"glance-c053-account-create-update-x8gmz\" (UID: \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\") " pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:33 crc kubenswrapper[4753]: E0129 14:24:33.399923 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:33 crc kubenswrapper[4753]: E0129 14:24:33.399964 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data podName:5f7e3e27-a036-4623-8d63-557a3c0d76e6 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:34.399950235 +0000 UTC m=+1309.094684617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6") : configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.403206 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c053-account-create-update-pbzxx"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.444207 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f8b4-account-create-update-8xzhc"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.445408 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.451803 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.494462 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-870b-account-create-update-kndlh"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.533082 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4dbbc-e575-4d30-bba1-f3785c1e497e-operator-scripts\") pod \"glance-c053-account-create-update-x8gmz\" (UID: \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\") " pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.533146 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkq48\" (UniqueName: \"kubernetes.io/projected/31a4dbbc-e575-4d30-bba1-f3785c1e497e-kube-api-access-lkq48\") pod \"glance-c053-account-create-update-x8gmz\" (UID: \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\") " pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:33 crc kubenswrapper[4753]: E0129 14:24:33.554941 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 29 14:24:33 crc kubenswrapper[4753]: E0129 14:24:33.561378 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data podName:ad5c04aa-ed92-4c33-ad37-4420b362e237 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:34.054985796 +0000 UTC m=+1308.749720178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data") pod "rabbitmq-server-0" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237") : configmap "rabbitmq-config-data" not found Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.583724 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4dbbc-e575-4d30-bba1-f3785c1e497e-operator-scripts\") pod \"glance-c053-account-create-update-x8gmz\" (UID: \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\") " pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.675610 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78jp7\" (UniqueName: \"kubernetes.io/projected/ef1a7724-ed52-4481-bbaa-1a0464561b2a-kube-api-access-78jp7\") pod \"cinder-f8b4-account-create-update-8xzhc\" (UID: \"ef1a7724-ed52-4481-bbaa-1a0464561b2a\") " pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.675696 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1a7724-ed52-4481-bbaa-1a0464561b2a-operator-scripts\") pod \"cinder-f8b4-account-create-update-8xzhc\" (UID: \"ef1a7724-ed52-4481-bbaa-1a0464561b2a\") " pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.693247 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.702721 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-870b-account-create-update-kndlh"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.707659 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.715431 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkq48\" (UniqueName: \"kubernetes.io/projected/31a4dbbc-e575-4d30-bba1-f3785c1e497e-kube-api-access-lkq48\") pod \"glance-c053-account-create-update-x8gmz\" (UID: \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\") " pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.770576 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f8b4-account-create-update-8xzhc"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.783538 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1a7724-ed52-4481-bbaa-1a0464561b2a-operator-scripts\") pod \"cinder-f8b4-account-create-update-8xzhc\" (UID: \"ef1a7724-ed52-4481-bbaa-1a0464561b2a\") " pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.783612 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7bgp\" (UniqueName: \"kubernetes.io/projected/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-kube-api-access-k7bgp\") pod \"nova-api-870b-account-create-update-kndlh\" (UID: \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\") " pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.783692 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-operator-scripts\") pod \"nova-api-870b-account-create-update-kndlh\" (UID: \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\") " pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.788114 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1a7724-ed52-4481-bbaa-1a0464561b2a-operator-scripts\") pod \"cinder-f8b4-account-create-update-8xzhc\" (UID: \"ef1a7724-ed52-4481-bbaa-1a0464561b2a\") " pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.796998 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78jp7\" (UniqueName: \"kubernetes.io/projected/ef1a7724-ed52-4481-bbaa-1a0464561b2a-kube-api-access-78jp7\") pod \"cinder-f8b4-account-create-update-8xzhc\" (UID: \"ef1a7724-ed52-4481-bbaa-1a0464561b2a\") " pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.817061 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-nhv2c"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.882512 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78jp7\" (UniqueName: 
\"kubernetes.io/projected/ef1a7724-ed52-4481-bbaa-1a0464561b2a-kube-api-access-78jp7\") pod \"cinder-f8b4-account-create-update-8xzhc\" (UID: \"ef1a7724-ed52-4481-bbaa-1a0464561b2a\") " pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.888612 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f8b4-account-create-update-9k466"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.920929 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7bgp\" (UniqueName: \"kubernetes.io/projected/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-kube-api-access-k7bgp\") pod \"nova-api-870b-account-create-update-kndlh\" (UID: \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\") " pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.921118 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-operator-scripts\") pod \"nova-api-870b-account-create-update-kndlh\" (UID: \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\") " pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.923036 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-operator-scripts\") pod \"nova-api-870b-account-create-update-kndlh\" (UID: \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\") " pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.947671 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.963987 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f8b4-account-create-update-9k466"] Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.990576 4753 generic.go:334] "Generic (PLEG): container finished" podID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerID="656f30c915a464984557b8cece588ce1d9b95d296a1ac31530c3c6877585393c" exitCode=143 Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.990805 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f2dbc378-044c-49a2-a891-94a90a0acff1","Type":"ContainerDied","Data":"656f30c915a464984557b8cece588ce1d9b95d296a1ac31530c3c6877585393c"} Jan 29 14:24:33 crc kubenswrapper[4753]: I0129 14:24:33.997569 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7bgp\" (UniqueName: \"kubernetes.io/projected/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-kube-api-access-k7bgp\") pod \"nova-api-870b-account-create-update-kndlh\" (UID: \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\") " pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.002401 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-nhv2c"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.004098 4753 generic.go:334] "Generic (PLEG): container finished" podID="31019cc8-ce90-453f-be4f-949ed45a5873" containerID="0c9a422d95efc2b8a373980fb0f3a46037a883ada7821c87b5bc7209541856f4" exitCode=2 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.004275 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"31019cc8-ce90-453f-be4f-949ed45a5873","Type":"ContainerDied","Data":"0c9a422d95efc2b8a373980fb0f3a46037a883ada7821c87b5bc7209541856f4"} Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.018007 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-drwzl"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.051796 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-870b-account-create-update-2rg5l"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.052699 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.069686 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-drwzl"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.080639 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.096658 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-870b-account-create-update-2rg5l"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.120954 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-lzht5"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.122304 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.129287 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.140704 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data podName:ad5c04aa-ed92-4c33-ad37-4420b362e237 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:35.140679293 +0000 UTC m=+1309.835413675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data") pod "rabbitmq-server-0" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237") : configmap "rabbitmq-config-data" not found Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.140602 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.239437 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07dc97c8-9d59-458c-81af-f83e6f71b09c" path="/var/lib/kubelet/pods/07dc97c8-9d59-458c-81af-f83e6f71b09c/volumes" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.249435 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfc5598-76b4-4673-aeef-b9105a8e6853-operator-scripts\") pod \"nova-cell0-fad8-account-create-update-lzht5\" (UID: \"acfc5598-76b4-4673-aeef-b9105a8e6853\") " pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.249645 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g62rn\" (UniqueName: \"kubernetes.io/projected/acfc5598-76b4-4673-aeef-b9105a8e6853-kube-api-access-g62rn\") pod \"nova-cell0-fad8-account-create-update-lzht5\" (UID: \"acfc5598-76b4-4673-aeef-b9105a8e6853\") " pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.260285 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b970443-356c-45b4-a764-800416b926e2" path="/var/lib/kubelet/pods/1b970443-356c-45b4-a764-800416b926e2/volumes" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.271398 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c" path="/var/lib/kubelet/pods/1bd43ba4-9ce7-4bb9-9bfa-6acf48d4ee1c/volumes" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.272533 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f613d6c-6411-47bd-8e65-fd132ae3e874" path="/var/lib/kubelet/pods/1f613d6c-6411-47bd-8e65-fd132ae3e874/volumes" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.273062 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43161e39-b904-49c6-be19-c11d01817594" path="/var/lib/kubelet/pods/43161e39-b904-49c6-be19-c11d01817594/volumes" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.274695 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d2de408-6b00-4029-a1a9-d0ab19401f96" path="/var/lib/kubelet/pods/8d2de408-6b00-4029-a1a9-d0ab19401f96/volumes" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.275401 4753 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93698618-cbb9-4158-8c7e-1f37dc143651" path="/var/lib/kubelet/pods/93698618-cbb9-4158-8c7e-1f37dc143651/volumes" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.275925 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae41990f-dd6c-4bae-ba0c-ebfde88a25a0" path="/var/lib/kubelet/pods/ae41990f-dd6c-4bae-ba0c-ebfde88a25a0/volumes" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.276529 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-lzht5"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.276555 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-sg22z"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.279084 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-sg22z"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.279111 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.279132 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-vlh99"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.282076 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.282831 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerName="openstack-network-exporter" containerID="cri-o://fa247e070c5c59fdd8d782d513689c0f7fef6b0f207b3bdcee7043f734fd870a" gracePeriod=300 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.283279 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.283678 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerName="openstack-network-exporter" containerID="cri-o://8be6e8569c31079d6223e7f119db680fdc968fb974c6cb3d3a9876712caa477d" gracePeriod=300 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.288573 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-vlh99"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.299656 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.301128 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-zrdnm"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.301344 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" podUID="386b408e-6ac5-4f1d-8403-be4fc8aec57d" containerName="dnsmasq-dns" containerID="cri-o://f7c124c9fdda70a8e8042c2355a73ef99b59bc98a46f42c31fb3d71e038de5ed" gracePeriod=10 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.321648 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-h5z4v"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.337049 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-db-sync-zlbdz"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.350274 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-zlbdz"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.360312 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-h5z4v"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.361500 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfc5598-76b4-4673-aeef-b9105a8e6853-operator-scripts\") pod \"nova-cell0-fad8-account-create-update-lzht5\" (UID: \"acfc5598-76b4-4673-aeef-b9105a8e6853\") " pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.361558 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g62rn\" (UniqueName: \"kubernetes.io/projected/acfc5598-76b4-4673-aeef-b9105a8e6853-kube-api-access-g62rn\") pod \"nova-cell0-fad8-account-create-update-lzht5\" (UID: \"acfc5598-76b4-4673-aeef-b9105a8e6853\") " pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.362569 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfc5598-76b4-4673-aeef-b9105a8e6853-operator-scripts\") pod \"nova-cell0-fad8-account-create-update-lzht5\" (UID: \"acfc5598-76b4-4673-aeef-b9105a8e6853\") " pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.370049 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gv87j"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.377588 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gv87j"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.387346 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-98h7m"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.401227 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78cbccfdbb-x7lwh"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.402772 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g62rn\" (UniqueName: \"kubernetes.io/projected/acfc5598-76b4-4673-aeef-b9105a8e6853-kube-api-access-g62rn\") pod \"nova-cell0-fad8-account-create-update-lzht5\" (UID: \"acfc5598-76b4-4673-aeef-b9105a8e6853\") " pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.403057 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78cbccfdbb-x7lwh" podUID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerName="placement-log" containerID="cri-o://0048536c095c1deea68c977175a1614cd4da94f70d3e465ff69e6a722a840709" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.403302 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78cbccfdbb-x7lwh" podUID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerName="placement-api" containerID="cri-o://7928bff86f6434736cea8c468f448a8ff6bae7916724dac5e9b712cc68b65281" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.437242 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-db-sync-79v95"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.462794 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78skc\" (UniqueName: \"kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc\") pod \"nova-cell1-ced3-account-create-update-sg22z\" (UID: \"0de6053f-c9da-4625-a175-c9649c10d462\") " pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.462950 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts\") pod \"nova-cell1-ced3-account-create-update-sg22z\" (UID: \"0de6053f-c9da-4625-a175-c9649c10d462\") " pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.463080 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.463127 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data podName:5f7e3e27-a036-4623-8d63-557a3c0d76e6 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:36.463113386 +0000 UTC m=+1311.157847768 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6") : configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.472992 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-rdpjg"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.473272 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-rdpjg" podUID="1e919242-0fa9-4c26-90c5-718fec0a9109" containerName="openstack-network-exporter" containerID="cri-o://2e8b4350e06c3e2f7876fe0bc0220340e6c1e15c1569e84e4a38084b04546f2d" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.497083 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-79v95"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.521211 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rm9d5"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.536825 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55965d95bf-pftcq"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.537459 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55965d95bf-pftcq" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerName="neutron-api" containerID="cri-o://24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.537668 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55965d95bf-pftcq" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerName="neutron-httpd" containerID="cri-o://ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 
14:24:34.546754 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.606056 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerName="ovsdbserver-nb" containerID="cri-o://1b90c85fe79f6d643048d2a3c0d72a78c8d8d608ab45d4967e59e9a9eb61c33c" gracePeriod=300 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.632274 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wml54"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.652915 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wml54"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.666594 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts\") pod \"nova-cell1-ced3-account-create-update-sg22z\" (UID: \"0de6053f-c9da-4625-a175-c9649c10d462\") " pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.666690 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78skc\" (UniqueName: \"kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc\") pod \"nova-cell1-ced3-account-create-update-sg22z\" (UID: \"0de6053f-c9da-4625-a175-c9649c10d462\") " pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.668911 4753 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.672859 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerName="ovsdbserver-sb" containerID="cri-o://0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf" gracePeriod=300 Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.687609 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts podName:0de6053f-c9da-4625-a175-c9649c10d462 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:35.187575717 +0000 UTC m=+1309.882310099 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts") pod "nova-cell1-ced3-account-create-update-sg22z" (UID: "0de6053f-c9da-4625-a175-c9649c10d462") : configmap "openstack-cell1-scripts" not found Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.689137 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xnbzg"] Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.690470 4753 projected.go:194] Error preparing data for projected volume kube-api-access-78skc for pod openstack/nova-cell1-ced3-account-create-update-sg22z: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.690552 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc podName:0de6053f-c9da-4625-a175-c9649c10d462 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:35.190533225 +0000 UTC m=+1309.885267607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-78skc" (UniqueName: "kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc") pod "nova-cell1-ced3-account-create-update-sg22z" (UID: "0de6053f-c9da-4625-a175-c9649c10d462") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.700723 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xnbzg"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.723189 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-gqp9n"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.730321 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-gqp9n"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.802464 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.894796 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s47kf"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.904846 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-s47kf"] Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.911409 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-403e-account-create-update-fksgc"] Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.953914 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:34 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: if [ -n "neutron" 
]; then Jan 29 14:24:34 crc kubenswrapper[4753]: GRANT_DATABASE="neutron" Jan 29 14:24:34 crc kubenswrapper[4753]: else Jan 29 14:24:34 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:34 crc kubenswrapper[4753]: fi Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:34 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:34 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:34 crc kubenswrapper[4753]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:34 crc kubenswrapper[4753]: # support updates Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.958495 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-b7f8-account-create-update-2zm4w" podUID="0dc24c61-81b8-4f51-a2c7-961548f1b11b" Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.965711 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:34 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: if [ -n "" ]; then Jan 29 14:24:34 crc kubenswrapper[4753]: GRANT_DATABASE="" Jan 29 14:24:34 crc kubenswrapper[4753]: else Jan 29 14:24:34 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:34 crc kubenswrapper[4753]: fi Jan 29 14:24:34 crc kubenswrapper[4753]: Jan 29 14:24:34 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:34 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:34 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:34 crc kubenswrapper[4753]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to
Jan 29 14:24:34 crc kubenswrapper[4753]: # support updates
Jan 29 14:24:34 crc kubenswrapper[4753]: 
Jan 29 14:24:34 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError"
Jan 29 14:24:34 crc kubenswrapper[4753]: E0129 14:24:34.967645 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-c286z" podUID="558ab235-719b-4913-aab2-863bfa6586e8"
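
[note] In both "Unhandled Error" dumps above, the logger truncates the container command: everything between "$MYSQL_CMD <" and the closing "> logger=" marker (the SQL heredoc) is elided, and it is left elided here as well. Going only by the comments that do survive (create the user with CREATE, then set password/TLS with ALTER, then GRANT), a plausible sketch of the elided statements could look like the following; DatabaseUser and the '%' host are illustrative assumptions, not recovered from this log:

    $MYSQL_CMD <<EOF
    CREATE USER IF NOT EXISTS '${DatabaseUser}'@'%';                       # DatabaseUser is assumed; only
    ALTER USER '${DatabaseUser}'@'%' IDENTIFIED BY '${DatabasePassword}';  # DatabasePassword and GRANT_DATABASE
    GRANT ALL PRIVILEGES ON ${GRANT_DATABASE}.* TO '${DatabaseUser}'@'%';  # appear in the logged script
    EOF

Neither pod actually reaches this script here: both fail earlier with CreateContainerConfigError because the referenced DB secrets do not exist yet.
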
pod="openstack/swift-storage-0" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-server" containerID="cri-o://f7948c4c2ea98b702b132274cc49f43a7f9f174f2d4a40192b402344998935a2" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.986774 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-updater" containerID="cri-o://2f0532da04eff45cc1a9f36829b01187bceb564dbe88cdc93585471ff9447783" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.986846 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-auditor" containerID="cri-o://9cc09e193c62f0685bbeb053af7bcee09164f6f7a1f094768044b967541fdd99" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.986907 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-replicator" containerID="cri-o://e4b8c37d71f04f91b5041bbfd2ebe5398fdb2a0eabb2c788da216bac1701192b" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.986951 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-server" containerID="cri-o://902624d332ba6e97b89215d66460f6d1894207ecaac9579489c0f78d378469ed" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.986991 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-reaper" containerID="cri-o://851cf5e8ab4d46230b00d65ff6d2fe461116124fb2be946904a0939073438463" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.987031 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-auditor" containerID="cri-o://e165e57a40f6bc0c996660f2d010eaf2048fe443cee81d3a5bb35a9b618cefea" gracePeriod=30 Jan 29 14:24:34 crc kubenswrapper[4753]: I0129 14:24:34.987069 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-replicator" containerID="cri-o://8c14da07d1d3d581ed1cdb3f263473ebec5da8f60db91aaf8dae95021a6acbfa" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.032509 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b19308d0814c3df635fdb38228ce9b7ebf5a99fefcc0274c1834d736932a59bd" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.039884 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rdpjg_1e919242-0fa9-4c26-90c5-718fec0a9109/openstack-network-exporter/0.log" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.039926 4753 generic.go:334] "Generic (PLEG): container finished" podID="1e919242-0fa9-4c26-90c5-718fec0a9109" containerID="2e8b4350e06c3e2f7876fe0bc0220340e6c1e15c1569e84e4a38084b04546f2d" 
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.039926 4753 generic.go:334] "Generic (PLEG): container finished" podID="1e919242-0fa9-4c26-90c5-718fec0a9109" containerID="2e8b4350e06c3e2f7876fe0bc0220340e6c1e15c1569e84e4a38084b04546f2d" exitCode=2
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.040067 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rdpjg" event={"ID":"1e919242-0fa9-4c26-90c5-718fec0a9109","Type":"ContainerDied","Data":"2e8b4350e06c3e2f7876fe0bc0220340e6c1e15c1569e84e4a38084b04546f2d"}
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.042511 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7f8-account-create-update-2zm4w" event={"ID":"0dc24c61-81b8-4f51-a2c7-961548f1b11b","Type":"ContainerStarted","Data":"00d8949f8f371a4a63403fa15f6ce63e0989553fa96664b0c3aa3ab7af2bc8d6"}
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.046120 4753 generic.go:334] "Generic (PLEG): container finished" podID="606db0f5-bff8-4d65-bbad-63b8b1ba362c" containerID="b9f6872bb6478c7a1880d301b55f528a8869bd68eb2346a5287f912f5e9ed844" exitCode=137
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.049323 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" containerName="rabbitmq" containerID="cri-o://a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c" gracePeriod=604800
Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.050402 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Jan 29 14:24:35 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash
Jan 29 14:24:35 crc kubenswrapper[4753]: 
Jan 29 14:24:35 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Jan 29 14:24:35 crc kubenswrapper[4753]: 
Jan 29 14:24:35 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Jan 29 14:24:35 crc kubenswrapper[4753]: 
Jan 29 14:24:35 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306"
Jan 29 14:24:35 crc kubenswrapper[4753]: 
Jan 29 14:24:35 crc kubenswrapper[4753]: if [ -n "neutron" ]; then
Jan 29 14:24:35 crc kubenswrapper[4753]: GRANT_DATABASE="neutron"
Jan 29 14:24:35 crc kubenswrapper[4753]: else
Jan 29 14:24:35 crc kubenswrapper[4753]: GRANT_DATABASE="*"
Jan 29 14:24:35 crc kubenswrapper[4753]: fi
Jan 29 14:24:35 crc kubenswrapper[4753]: 
Jan 29 14:24:35 crc kubenswrapper[4753]: # going for maximum compatibility here:
Jan 29 14:24:35 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Jan 29 14:24:35 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Jan 29 14:24:35 crc kubenswrapper[4753]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:35 crc kubenswrapper[4753]: # support updates Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.051729 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-b7f8-account-create-update-2zm4w" podUID="0dc24c61-81b8-4f51-a2c7-961548f1b11b" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.052187 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-np8wt"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.052543 4753 generic.go:334] "Generic (PLEG): container finished" podID="386b408e-6ac5-4f1d-8403-be4fc8aec57d" containerID="f7c124c9fdda70a8e8042c2355a73ef99b59bc98a46f42c31fb3d71e038de5ed" exitCode=0 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.052644 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" event={"ID":"386b408e-6ac5-4f1d-8403-be4fc8aec57d","Type":"ContainerDied","Data":"f7c124c9fdda70a8e8042c2355a73ef99b59bc98a46f42c31fb3d71e038de5ed"} Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.053613 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf is running failed: container process not found" containerID="0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.055536 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf is running failed: container process not found" containerID="0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.058275 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf is running failed: container process not found" containerID="0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.058300 4753 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerName="ovsdbserver-sb" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.067286 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-np8wt"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.071092 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dd12b7e9-4fd2-4ae9-8e74-d8499726d995/ovsdbserver-nb/0.log" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 
14:24:35.071132 4753 generic.go:334] "Generic (PLEG): container finished" podID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerID="8be6e8569c31079d6223e7f119db680fdc968fb974c6cb3d3a9876712caa477d" exitCode=2
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.071165 4753 generic.go:334] "Generic (PLEG): container finished" podID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerID="1b90c85fe79f6d643048d2a3c0d72a78c8d8d608ab45d4967e59e9a9eb61c33c" exitCode=143
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.071225 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dd12b7e9-4fd2-4ae9-8e74-d8499726d995","Type":"ContainerDied","Data":"8be6e8569c31079d6223e7f119db680fdc968fb974c6cb3d3a9876712caa477d"}
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.071252 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dd12b7e9-4fd2-4ae9-8e74-d8499726d995","Type":"ContainerDied","Data":"1b90c85fe79f6d643048d2a3c0d72a78c8d8d608ab45d4967e59e9a9eb61c33c"}
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.074855 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_08759bfe-4b2e-4da9-b0b0-2149a71a831e/ovsdbserver-sb/0.log"
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.074883 4753 generic.go:334] "Generic (PLEG): container finished" podID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerID="fa247e070c5c59fdd8d782d513689c0f7fef6b0f207b3bdcee7043f734fd870a" exitCode=2
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.074894 4753 generic.go:334] "Generic (PLEG): container finished" podID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerID="0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf" exitCode=143
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.074932 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"08759bfe-4b2e-4da9-b0b0-2149a71a831e","Type":"ContainerDied","Data":"fa247e070c5c59fdd8d782d513689c0f7fef6b0f207b3bdcee7043f734fd870a"}
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.074951 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"08759bfe-4b2e-4da9-b0b0-2149a71a831e","Type":"ContainerDied","Data":"0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf"}
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.082233 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b7f8-account-create-update-2zm4w"]
Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.089503 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b19308d0814c3df635fdb38228ce9b7ebf5a99fefcc0274c1834d736932a59bd" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.089870 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c286z" event={"ID":"558ab235-719b-4913-aab2-863bfa6586e8","Type":"ContainerStarted","Data":"41d74e4feaeb0b2190fc2c5ce8b2fc54c7da6fbcce00f05a893bd28ad38d8332"}
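
[note] The "ExecSync cmd from runtime service failed" / "Probe errored ... container is stopping" pairs above and just below are expected teardown noise: the kubelet keeps running exec readiness probes (status_check.sh here, pidof ovsdb-server earlier) while CRI-O is already stopping the container, so the exec can no longer be registered and is surfaced as exit code -1. To inspect which probe is firing, assuming the pod still exists (standard kubectl, not from this log):

    kubectl -n openstack get pod ovn-northd-0 -o jsonpath='{.spec.containers[*].readinessProbe.exec.command}'
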
pod="openstack/root-account-create-update-c286z" secret="" err="secret \"galera-openstack-cell1-dockercfg-5cl64\" not found" Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.095406 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:35 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: if [ -n "" ]; then Jan 29 14:24:35 crc kubenswrapper[4753]: GRANT_DATABASE="" Jan 29 14:24:35 crc kubenswrapper[4753]: else Jan 29 14:24:35 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:35 crc kubenswrapper[4753]: fi Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:35 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:35 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:35 crc kubenswrapper[4753]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:35 crc kubenswrapper[4753]: # support updates Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.096968 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-c286z" podUID="558ab235-719b-4913-aab2-863bfa6586e8" Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.122727 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b19308d0814c3df635fdb38228ce9b7ebf5a99fefcc0274c1834d736932a59bd" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.123024 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" containerName="ovn-northd" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.123452 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.123681 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3184c640-0157-4211-aa5a-aada8557e9f8" containerName="glance-log" containerID="cri-o://f5895e5563577e5f99ca58e92f19470e5b0e974e28396e83be746eec355480e3" 
gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.123784 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3184c640-0157-4211-aa5a-aada8557e9f8" containerName="glance-httpd" containerID="cri-o://bf98498966b9708676b313afca0a0b4bb674752fe39d67e44f9f70b35df870b7" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.136181 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.136462 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6810e266-dec6-4731-884b-067f214781c2" containerName="glance-log" containerID="cri-o://b0fc984341d3bf9cf81937c474fb2a82c3b897efaf7a2c1e16681e411cbe9085" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.136612 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6810e266-dec6-4731-884b-067f214781c2" containerName="glance-httpd" containerID="cri-o://e2d1fb3d7fa36ee6d4949b785910ce4cfa547832c2acade41dcbf22dc90c2c6e" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.157291 4753 generic.go:334] "Generic (PLEG): container finished" podID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerID="0048536c095c1deea68c977175a1614cd4da94f70d3e465ff69e6a722a840709" exitCode=143 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.157336 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cbccfdbb-x7lwh" event={"ID":"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7","Type":"ContainerDied","Data":"0048536c095c1deea68c977175a1614cd4da94f70d3e465ff69e6a722a840709"} Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.182792 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.182879 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data podName:ad5c04aa-ed92-4c33-ad37-4420b362e237 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:37.182862525 +0000 UTC m=+1311.877596907 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data") pod "rabbitmq-server-0" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237") : configmap "rabbitmq-config-data" not found Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.237019 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b7f8-account-create-update-2zm4w"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.250271 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4mdvn"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.277200 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4mdvn"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.277279 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.277468 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-log" containerID="cri-o://f34f5bfa0d06f0211a2a796c39c41d96c4d203bd1909c7053e13afe8556789c1" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.279887 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-metadata" containerID="cri-o://0a800f0f0fcb4a6f6136a9675b82b2ab62e4096da3d7cc45f7830d2af553041f" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.291639 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts\") pod \"nova-cell1-ced3-account-create-update-sg22z\" (UID: \"0de6053f-c9da-4625-a175-c9649c10d462\") " pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.292243 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78skc\" (UniqueName: \"kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc\") pod \"nova-cell1-ced3-account-create-update-sg22z\" (UID: \"0de6053f-c9da-4625-a175-c9649c10d462\") " pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.293349 4753 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.293407 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts podName:0de6053f-c9da-4625-a175-c9649c10d462 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:36.29338743 +0000 UTC m=+1310.988121802 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts") pod "nova-cell1-ced3-account-create-update-sg22z" (UID: "0de6053f-c9da-4625-a175-c9649c10d462") : configmap "openstack-cell1-scripts" not found Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.293683 4753 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.293715 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts podName:558ab235-719b-4913-aab2-863bfa6586e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:35.793706288 +0000 UTC m=+1310.488440670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts") pod "root-account-create-update-c286z" (UID: "558ab235-719b-4913-aab2-863bfa6586e8") : configmap "openstack-cell1-scripts" not found Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.312261 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9257-account-create-update-bzjs4"] Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.313429 4753 projected.go:194] Error preparing data for projected volume kube-api-access-78skc for pod openstack/nova-cell1-ced3-account-create-update-sg22z: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.313493 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc podName:0de6053f-c9da-4625-a175-c9649c10d462 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:36.313474275 +0000 UTC m=+1311.008208657 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-78skc" (UniqueName: "kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc") pod "nova-cell1-ced3-account-create-update-sg22z" (UID: "0de6053f-c9da-4625-a175-c9649c10d462") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.331614 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-m9d5c"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.345746 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-m9d5c"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.356367 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c286z"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.366226 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.366530 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-log" containerID="cri-o://63959aa4ed603fda25a0942fccae384fdcda0338c3bd1c0131967af7f34b728b" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.367066 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-api" containerID="cri-o://49145b48bee89f5fd944b3ebdb12f7d989505bd4659c96c697f65f65d2481518" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.403380 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c053-account-create-update-x8gmz"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.430567 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.435835 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.465455 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-zjqlx"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.470202 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d9c4fb469-wlxbk"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.470444 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" podUID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" containerName="barbican-worker-log" containerID="cri-o://125994f3ec1405b9e8dad0539053aa0a87df73ee58d6e17040fc7d0d422ceb19" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.470551 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" podUID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" containerName="barbican-worker" containerID="cri-o://4c6f60456d1d153f776a9722a5c5c012f82210a1ce9b96301d44ce657ac865a5" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.483752 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-zjqlx"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.491938 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f8b4-account-create-update-8xzhc"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.501112 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-sb\") pod \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.501180 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-swift-storage-0\") pod \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.501250 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-nb\") pod \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.501364 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-svc\") pod \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.501402 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm2dc\" (UniqueName: \"kubernetes.io/projected/386b408e-6ac5-4f1d-8403-be4fc8aec57d-kube-api-access-jm2dc\") pod \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\" (UID: \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.501426 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-config\") pod \"386b408e-6ac5-4f1d-8403-be4fc8aec57d\" (UID: 
\"386b408e-6ac5-4f1d-8403-be4fc8aec57d\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.510276 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.520886 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-xrf5w"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.524771 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386b408e-6ac5-4f1d-8403-be4fc8aec57d-kube-api-access-jm2dc" (OuterVolumeSpecName: "kube-api-access-jm2dc") pod "386b408e-6ac5-4f1d-8403-be4fc8aec57d" (UID: "386b408e-6ac5-4f1d-8403-be4fc8aec57d"). InnerVolumeSpecName "kube-api-access-jm2dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.527240 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rdpjg_1e919242-0fa9-4c26-90c5-718fec0a9109/openstack-network-exporter/0.log" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.527315 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.534344 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-xrf5w"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.567122 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-599fdd45b6-c7l8c"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.567431 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" podUID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerName="barbican-keystone-listener-log" containerID="cri-o://ad233db4e8b1fd3e1f6f172ac5f80cdfd53b96374113b3ddbb73831c7e9cb5c9" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.567901 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" podUID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerName="barbican-keystone-listener" containerID="cri-o://4aa6335e18e43896614c27e4605f0813d42217c00887d65f886c827434a9fc80" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.599195 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.624778 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovs-rundir\") pod \"1e919242-0fa9-4c26-90c5-718fec0a9109\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.625987 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-metrics-certs-tls-certs\") pod \"1e919242-0fa9-4c26-90c5-718fec0a9109\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.629970 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovn-rundir\") pod \"1e919242-0fa9-4c26-90c5-718fec0a9109\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.630169 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-combined-ca-bundle\") pod \"1e919242-0fa9-4c26-90c5-718fec0a9109\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.630324 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e919242-0fa9-4c26-90c5-718fec0a9109-config\") pod \"1e919242-0fa9-4c26-90c5-718fec0a9109\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.630457 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cnrt\" (UniqueName: \"kubernetes.io/projected/1e919242-0fa9-4c26-90c5-718fec0a9109-kube-api-access-9cnrt\") pod \"1e919242-0fa9-4c26-90c5-718fec0a9109\" (UID: \"1e919242-0fa9-4c26-90c5-718fec0a9109\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.625871 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "1e919242-0fa9-4c26-90c5-718fec0a9109" (UID: "1e919242-0fa9-4c26-90c5-718fec0a9109"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.630082 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "1e919242-0fa9-4c26-90c5-718fec0a9109" (UID: "1e919242-0fa9-4c26-90c5-718fec0a9109"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.631175 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e919242-0fa9-4c26-90c5-718fec0a9109-config" (OuterVolumeSpecName: "config") pod "1e919242-0fa9-4c26-90c5-718fec0a9109" (UID: "1e919242-0fa9-4c26-90c5-718fec0a9109"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.634322 4753 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovs-rundir\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.634356 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm2dc\" (UniqueName: \"kubernetes.io/projected/386b408e-6ac5-4f1d-8403-be4fc8aec57d-kube-api-access-jm2dc\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.634368 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1e919242-0fa9-4c26-90c5-718fec0a9109-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.634382 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e919242-0fa9-4c26-90c5-718fec0a9109-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.659452 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="ad5c04aa-ed92-4c33-ad37-4420b362e237" containerName="rabbitmq" containerID="cri-o://3b34f95853a15ff9210f7c5a34e53924e5ea049fe09b94a0c39100cd6c83fdab" gracePeriod=604800 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.674141 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-p4tmx"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.685745 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-870b-account-create-update-kndlh"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.693493 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e919242-0fa9-4c26-90c5-718fec0a9109-kube-api-access-9cnrt" (OuterVolumeSpecName: "kube-api-access-9cnrt") pod "1e919242-0fa9-4c26-90c5-718fec0a9109" (UID: "1e919242-0fa9-4c26-90c5-718fec0a9109"). InnerVolumeSpecName "kube-api-access-9cnrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.695035 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-n6b96"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.715366 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-lzht5"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.737894 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-combined-ca-bundle\") pod \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.737994 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvszk\" (UniqueName: \"kubernetes.io/projected/606db0f5-bff8-4d65-bbad-63b8b1ba362c-kube-api-access-dvszk\") pod \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.738483 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config-secret\") pod \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.738514 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config\") pod \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.739115 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cnrt\" (UniqueName: \"kubernetes.io/projected/1e919242-0fa9-4c26-90c5-718fec0a9109-kube-api-access-9cnrt\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.755313 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/606db0f5-bff8-4d65-bbad-63b8b1ba362c-kube-api-access-dvszk" (OuterVolumeSpecName: "kube-api-access-dvszk") pod "606db0f5-bff8-4d65-bbad-63b8b1ba362c" (UID: "606db0f5-bff8-4d65-bbad-63b8b1ba362c"). InnerVolumeSpecName "kube-api-access-dvszk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.759792 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-754c57f55b-2hkbd"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.760062 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-754c57f55b-2hkbd" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api-log" containerID="cri-o://eae4f86edd4d25f3149ae9e9e7b406efbd7b4f7e532051ca604088e784fc5e54" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.760200 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-754c57f55b-2hkbd" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api" containerID="cri-o://831d4b37a8f34dc6da88e5991350950aff013d5011940ad0f9ecbf93b46818a1" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.771121 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-p4tmx"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.772636 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_08759bfe-4b2e-4da9-b0b0-2149a71a831e/ovsdbserver-sb/0.log" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.773064 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.791334 4753 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 29 14:24:35 crc kubenswrapper[4753]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 29 14:24:35 crc kubenswrapper[4753]: + source /usr/local/bin/container-scripts/functions Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNBridge=br-int Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNRemote=tcp:localhost:6642 Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNEncapType=geneve Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNAvailabilityZones= Jan 29 14:24:35 crc kubenswrapper[4753]: ++ EnableChassisAsGateway=true Jan 29 14:24:35 crc kubenswrapper[4753]: ++ PhysicalNetworks= Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNHostName= Jan 29 14:24:35 crc kubenswrapper[4753]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 29 14:24:35 crc kubenswrapper[4753]: ++ ovs_dir=/var/lib/openvswitch Jan 29 14:24:35 crc kubenswrapper[4753]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 29 14:24:35 crc kubenswrapper[4753]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 29 14:24:35 crc kubenswrapper[4753]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 29 14:24:35 crc kubenswrapper[4753]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 29 14:24:35 crc kubenswrapper[4753]: + sleep 0.5 Jan 29 14:24:35 crc kubenswrapper[4753]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 29 14:24:35 crc kubenswrapper[4753]: + sleep 0.5 Jan 29 14:24:35 crc kubenswrapper[4753]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 29 14:24:35 crc kubenswrapper[4753]: + cleanup_ovsdb_server_semaphore Jan 29 14:24:35 crc kubenswrapper[4753]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 29 14:24:35 crc kubenswrapper[4753]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 29 14:24:35 crc kubenswrapper[4753]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-98h7m" message=< Jan 29 14:24:35 crc kubenswrapper[4753]: Exiting ovsdb-server (5) [ OK ] Jan 29 14:24:35 crc kubenswrapper[4753]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 29 14:24:35 crc kubenswrapper[4753]: + source /usr/local/bin/container-scripts/functions Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNBridge=br-int Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNRemote=tcp:localhost:6642 Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNEncapType=geneve Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNAvailabilityZones= Jan 29 14:24:35 crc kubenswrapper[4753]: ++ EnableChassisAsGateway=true Jan 29 14:24:35 crc kubenswrapper[4753]: ++ PhysicalNetworks= Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNHostName= Jan 29 14:24:35 crc kubenswrapper[4753]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 29 14:24:35 crc kubenswrapper[4753]: ++ ovs_dir=/var/lib/openvswitch Jan 29 14:24:35 crc kubenswrapper[4753]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 29 14:24:35 crc kubenswrapper[4753]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 29 14:24:35 crc kubenswrapper[4753]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 29 14:24:35 crc kubenswrapper[4753]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 29 14:24:35 crc kubenswrapper[4753]: + sleep 0.5 Jan 29 14:24:35 crc kubenswrapper[4753]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 29 14:24:35 crc kubenswrapper[4753]: + sleep 0.5 Jan 29 14:24:35 crc kubenswrapper[4753]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 29 14:24:35 crc kubenswrapper[4753]: + cleanup_ovsdb_server_semaphore Jan 29 14:24:35 crc kubenswrapper[4753]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 29 14:24:35 crc kubenswrapper[4753]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 29 14:24:35 crc kubenswrapper[4753]: > Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.791381 4753 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 29 14:24:35 crc kubenswrapper[4753]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 29 14:24:35 crc kubenswrapper[4753]: + source /usr/local/bin/container-scripts/functions Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNBridge=br-int Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNRemote=tcp:localhost:6642 Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNEncapType=geneve Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNAvailabilityZones= Jan 29 14:24:35 crc kubenswrapper[4753]: ++ EnableChassisAsGateway=true Jan 29 14:24:35 crc kubenswrapper[4753]: ++ PhysicalNetworks= Jan 29 14:24:35 crc kubenswrapper[4753]: ++ OVNHostName= Jan 29 14:24:35 crc kubenswrapper[4753]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 29 14:24:35 crc kubenswrapper[4753]: ++ ovs_dir=/var/lib/openvswitch Jan 29 14:24:35 crc kubenswrapper[4753]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 29 14:24:35 crc kubenswrapper[4753]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 29 14:24:35 crc kubenswrapper[4753]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 29 14:24:35 crc kubenswrapper[4753]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 29 14:24:35 crc kubenswrapper[4753]: + sleep 0.5 Jan 29 14:24:35 crc kubenswrapper[4753]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 29 14:24:35 crc kubenswrapper[4753]: + sleep 0.5 Jan 29 14:24:35 crc kubenswrapper[4753]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 29 14:24:35 crc kubenswrapper[4753]: + cleanup_ovsdb_server_semaphore Jan 29 14:24:35 crc kubenswrapper[4753]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 29 14:24:35 crc kubenswrapper[4753]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 29 14:24:35 crc kubenswrapper[4753]: > pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server" containerID="cri-o://f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.791412 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server" containerID="cri-o://f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" gracePeriod=29 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.791876 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-n6b96"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.791904 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-sg22z"] Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.792575 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-78skc operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell1-ced3-account-create-update-sg22z" podUID="0de6053f-c9da-4625-a175-c9649c10d462" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.803743 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.804000 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="de34f6dc-67dd-4054-84a9-a051e0ba2876" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://857539f16ab7d5f43263fb3fc69c40c7cfc6306d139f5a0e01c992147a025e17" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.825093 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd" containerID="cri-o://49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" gracePeriod=29 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.825907 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "386b408e-6ac5-4f1d-8403-be4fc8aec57d" (UID: "386b408e-6ac5-4f1d-8403-be4fc8aec57d"). InnerVolumeSpecName "ovsdbserver-nb". 
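The PreStop hook traced above polls for a semaphore file in 0.5s steps before running ovs-ctl stop; here the hook script itself was killed (exit 137, i.e. SIGKILL) and the subsequent kills of ovsdb-server and ovs-vswitchd use gracePeriod=29 rather than 30, because time spent in the hook is charged against the pod's grace period. A sketch of that accounting; the semaphore path below is a stand-in, and the real kubelet drives the hook through the CRI rather than exec:

```go
package main

import (
	"context"
	"fmt"
	"os/exec"
	"time"
)

// runPreStop executes a hook under the pod's grace period and returns the
// grace that remains for the container kill itself. When the deadline hits,
// CommandContext kills the hook, the analogue of the exit-137 trace above.
func runPreStop(script string, grace time.Duration) time.Duration {
	ctx, cancel := context.WithTimeout(context.Background(), grace)
	defer cancel()
	start := time.Now()
	if err := exec.CommandContext(ctx, "sh", "-c", script).Run(); err != nil {
		fmt.Println("PreStop hook failed:", err)
	}
	remaining := grace - time.Since(start)
	if remaining < 0 {
		remaining = 0
	}
	return remaining
}

func main() {
	// /tmp/safe_to_stop is a hypothetical stand-in for the script's semaphore.
	left := runPreStop("while [ ! -f /tmp/safe_to_stop ]; do sleep 0.5; done", 2*time.Second)
	fmt.Printf("killing container with remaining grace %s\n", left.Round(time.Second))
}
```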
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.835869 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:35 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: if [ -n "barbican" ]; then Jan 29 14:24:35 crc kubenswrapper[4753]: GRANT_DATABASE="barbican" Jan 29 14:24:35 crc kubenswrapper[4753]: else Jan 29 14:24:35 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:35 crc kubenswrapper[4753]: fi Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:35 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:35 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:35 crc kubenswrapper[4753]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:35 crc kubenswrapper[4753]: # support updates Jan 29 14:24:35 crc kubenswrapper[4753]: Jan 29 14:24:35 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.837434 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c286z"] Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.838391 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-9257-account-create-update-bzjs4" podUID="bf251fba-e960-4f2f-ad9a-117ce9a2f5a0" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.839959 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdbserver-sb-tls-certs\") pod \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.840058 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.840095 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-config\") pod \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.840583 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-scripts\") pod \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.840792 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-metrics-certs-tls-certs\") pod \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.840959 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6sc5\" (UniqueName: \"kubernetes.io/projected/08759bfe-4b2e-4da9-b0b0-2149a71a831e-kube-api-access-x6sc5\") pod \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.841009 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-combined-ca-bundle\") pod \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.841081 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdb-rundir\") pod \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\" (UID: \"08759bfe-4b2e-4da9-b0b0-2149a71a831e\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.842033 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-scripts" (OuterVolumeSpecName: "scripts") pod "08759bfe-4b2e-4da9-b0b0-2149a71a831e" (UID: "08759bfe-4b2e-4da9-b0b0-2149a71a831e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.842722 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-config" (OuterVolumeSpecName: "config") pod "08759bfe-4b2e-4da9-b0b0-2149a71a831e" (UID: "08759bfe-4b2e-4da9-b0b0-2149a71a831e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.844035 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.844049 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08759bfe-4b2e-4da9-b0b0-2149a71a831e-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.844059 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.844070 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvszk\" (UniqueName: \"kubernetes.io/projected/606db0f5-bff8-4d65-bbad-63b8b1ba362c-kube-api-access-dvszk\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.844119 4753 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 29 14:24:35 crc kubenswrapper[4753]: E0129 14:24:35.844176 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts podName:558ab235-719b-4913-aab2-863bfa6586e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:36.844143426 +0000 UTC m=+1311.538877808 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts") pod "root-account-create-update-c286z" (UID: "558ab235-719b-4913-aab2-863bfa6586e8") : configmap "openstack-cell1-scripts" not found Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.844908 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "08759bfe-4b2e-4da9-b0b0-2149a71a831e" (UID: "08759bfe-4b2e-4da9-b0b0-2149a71a831e"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.856909 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dd12b7e9-4fd2-4ae9-8e74-d8499726d995/ovsdbserver-nb/0.log" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.856987 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.863113 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "386b408e-6ac5-4f1d-8403-be4fc8aec57d" (UID: "386b408e-6ac5-4f1d-8403-be4fc8aec57d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.886335 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08759bfe-4b2e-4da9-b0b0-2149a71a831e-kube-api-access-x6sc5" (OuterVolumeSpecName: "kube-api-access-x6sc5") pod "08759bfe-4b2e-4da9-b0b0-2149a71a831e" (UID: "08759bfe-4b2e-4da9-b0b0-2149a71a831e"). InnerVolumeSpecName "kube-api-access-x6sc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.887944 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "08759bfe-4b2e-4da9-b0b0-2149a71a831e" (UID: "08759bfe-4b2e-4da9-b0b0-2149a71a831e"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.911270 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.911556 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="15b88ba9-8449-4e76-a36c-34ca2b2be488" containerName="nova-scheduler-scheduler" containerID="cri-o://24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.915785 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9257-account-create-update-bzjs4"] Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.929951 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e919242-0fa9-4c26-90c5-718fec0a9109" (UID: "1e919242-0fa9-4c26-90c5-718fec0a9109"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.944690 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "606db0f5-bff8-4d65-bbad-63b8b1ba362c" (UID: "606db0f5-bff8-4d65-bbad-63b8b1ba362c"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.944957 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdg55\" (UniqueName: \"kubernetes.io/projected/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-kube-api-access-kdg55\") pod \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.945194 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-metrics-certs-tls-certs\") pod \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.945304 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config\") pod \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\" (UID: \"606db0f5-bff8-4d65-bbad-63b8b1ba362c\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.945453 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-scripts\") pod \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.945552 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.945667 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-config\") pod \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.945749 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdbserver-nb-tls-certs\") pod \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.945868 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdb-rundir\") pod \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.945930 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-combined-ca-bundle\") pod \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\" (UID: \"dd12b7e9-4fd2-4ae9-8e74-d8499726d995\") " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.947008 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6sc5\" (UniqueName: \"kubernetes.io/projected/08759bfe-4b2e-4da9-b0b0-2149a71a831e-kube-api-access-x6sc5\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc 
kubenswrapper[4753]: I0129 14:24:35.947128 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.947198 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.947260 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.947322 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.945193 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "386b408e-6ac5-4f1d-8403-be4fc8aec57d" (UID: "386b408e-6ac5-4f1d-8403-be4fc8aec57d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.949237 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "606db0f5-bff8-4d65-bbad-63b8b1ba362c" (UID: "606db0f5-bff8-4d65-bbad-63b8b1ba362c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: W0129 14:24:35.949339 4753 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/606db0f5-bff8-4d65-bbad-63b8b1ba362c/volumes/kubernetes.io~configmap/openstack-config Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.949356 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "606db0f5-bff8-4d65-bbad-63b8b1ba362c" (UID: "606db0f5-bff8-4d65-bbad-63b8b1ba362c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.949850 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-scripts" (OuterVolumeSpecName: "scripts") pod "dd12b7e9-4fd2-4ae9-8e74-d8499726d995" (UID: "dd12b7e9-4fd2-4ae9-8e74-d8499726d995"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.950174 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "dd12b7e9-4fd2-4ae9-8e74-d8499726d995" (UID: "dd12b7e9-4fd2-4ae9-8e74-d8499726d995"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.951782 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-config" (OuterVolumeSpecName: "config") pod "dd12b7e9-4fd2-4ae9-8e74-d8499726d995" (UID: "dd12b7e9-4fd2-4ae9-8e74-d8499726d995"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.954418 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-config" (OuterVolumeSpecName: "config") pod "386b408e-6ac5-4f1d-8403-be4fc8aec57d" (UID: "386b408e-6ac5-4f1d-8403-be4fc8aec57d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.961715 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="001ea12a-a725-4cd9-a12e-1442d56f7068" containerName="galera" containerID="cri-o://b638be6aff7479c4c3b3e3e264266ee7a0c8949f1731326d7adf23a76a43271b" gracePeriod=30 Jan 29 14:24:35 crc kubenswrapper[4753]: I0129 14:24:35.963933 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-kube-api-access-kdg55" (OuterVolumeSpecName: "kube-api-access-kdg55") pod "dd12b7e9-4fd2-4ae9-8e74-d8499726d995" (UID: "dd12b7e9-4fd2-4ae9-8e74-d8499726d995"). InnerVolumeSpecName "kube-api-access-kdg55". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:35.979540 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "dd12b7e9-4fd2-4ae9-8e74-d8499726d995" (UID: "dd12b7e9-4fd2-4ae9-8e74-d8499726d995"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:35.995518 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "386b408e-6ac5-4f1d-8403-be4fc8aec57d" (UID: "386b408e-6ac5-4f1d-8403-be4fc8aec57d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.050808 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.051167 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.051177 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.051187 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.051207 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.051216 4753 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/386b408e-6ac5-4f1d-8403-be4fc8aec57d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.051226 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.051238 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.051245 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.051254 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdg55\" (UniqueName: \"kubernetes.io/projected/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-kube-api-access-kdg55\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.095589 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hgtzp"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.115982 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hgtzp"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.116646 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08759bfe-4b2e-4da9-b0b0-2149a71a831e" (UID: "08759bfe-4b2e-4da9-b0b0-2149a71a831e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.149256 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.164837 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "606db0f5-bff8-4d65-bbad-63b8b1ba362c" (UID: "606db0f5-bff8-4d65-bbad-63b8b1ba362c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.166579 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/606db0f5-bff8-4d65-bbad-63b8b1ba362c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.166613 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.166623 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.190734 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 29 14:24:36 crc kubenswrapper[4753]: W0129 14:24:36.202685 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacfc5598_76b4_4673_aeef_b9105a8e6853.slice/crio-d1b5d984490842cc087fa4858e3e2c4c430bd18896c94ed0d1afbaf2a74d0a2f WatchSource:0}: Error finding container d1b5d984490842cc087fa4858e3e2c4c430bd18896c94ed0d1afbaf2a74d0a2f: Status 404 returned error can't find the container with id d1b5d984490842cc087fa4858e3e2c4c430bd18896c94ed0d1afbaf2a74d0a2f Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.215257 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020d51a4-77d3-4fa1-8184-5867ecf2d6d1" path="/var/lib/kubelet/pods/020d51a4-77d3-4fa1-8184-5867ecf2d6d1/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.215798 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "08759bfe-4b2e-4da9-b0b0-2149a71a831e" (UID: "08759bfe-4b2e-4da9-b0b0-2149a71a831e"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.216487 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dade6dd-52d6-419f-8460-438e78771a6d" path="/var/lib/kubelet/pods/4dade6dd-52d6-419f-8460-438e78771a6d/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.222546 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c1c8ae-4366-410e-b16b-3d06d55ce6e4" path="/var/lib/kubelet/pods/53c1c8ae-4366-410e-b16b-3d06d55ce6e4/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.223364 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="563e2a43-bce3-4ba1-bd5e-25ec89449549" path="/var/lib/kubelet/pods/563e2a43-bce3-4ba1-bd5e-25ec89449549/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.224041 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b901113-4b66-4eb9-ac27-ad9446e6aa29" path="/var/lib/kubelet/pods/5b901113-4b66-4eb9-ac27-ad9446e6aa29/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.228670 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0d7f08-1d4d-451d-98c3-873ac514fd53" path="/var/lib/kubelet/pods/5d0d7f08-1d4d-451d-98c3-873ac514fd53/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.239398 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="606db0f5-bff8-4d65-bbad-63b8b1ba362c" path="/var/lib/kubelet/pods/606db0f5-bff8-4d65-bbad-63b8b1ba362c/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.240508 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df97e34-891e-41f6-bf19-24d14e83c610" path="/var/lib/kubelet/pods/7df97e34-891e-41f6-bf19-24d14e83c610/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.241101 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="845fdb72-7823-4160-941a-e70288b5f77b" path="/var/lib/kubelet/pods/845fdb72-7823-4160-941a-e70288b5f77b/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.242891 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:36 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: if [ -n "nova_cell0" ]; then Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="nova_cell0" Jan 29 14:24:36 crc kubenswrapper[4753]: else Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:36 crc kubenswrapper[4753]: fi Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:36 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:36 crc kubenswrapper[4753]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:36 crc kubenswrapper[4753]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:36 crc kubenswrapper[4753]: # support updates Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.244239 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:36 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: if [ -n "cinder" ]; then Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="cinder" Jan 29 14:24:36 crc kubenswrapper[4753]: else Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:36 crc kubenswrapper[4753]: fi Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:36 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:36 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:36 crc kubenswrapper[4753]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:36 crc kubenswrapper[4753]: # support updates Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.245015 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-fad8-account-create-update-lzht5" podUID="acfc5598-76b4-4673-aeef-b9105a8e6853" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.245559 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:36 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: if [ -n "nova_api" ]; then Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="nova_api" Jan 29 14:24:36 crc kubenswrapper[4753]: else Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:36 crc kubenswrapper[4753]: fi Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:36 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:36 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:36 crc kubenswrapper[4753]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:36 crc kubenswrapper[4753]: # support updates Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.245602 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-f8b4-account-create-update-8xzhc" podUID="ef1a7724-ed52-4481-bbaa-1a0464561b2a" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.246435 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:36 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: if [ -n "glance" ]; then Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="glance" Jan 29 14:24:36 crc kubenswrapper[4753]: else Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:36 crc kubenswrapper[4753]: fi Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:36 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:36 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:36 crc kubenswrapper[4753]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:36 crc kubenswrapper[4753]: # support updates Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.246601 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-870b-account-create-update-kndlh" podUID="dfce025b-40c8-4ae3-b1c2-a1d858e11adb" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.248552 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-c053-account-create-update-x8gmz" podUID="31a4dbbc-e575-4d30-bba1-f3785c1e497e" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.252829 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd12b7e9-4fd2-4ae9-8e74-d8499726d995" (UID: "dd12b7e9-4fd2-4ae9-8e74-d8499726d995"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.255495 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94338586-7af0-45e5-b6aa-780295200d4e" path="/var/lib/kubelet/pods/94338586-7af0-45e5-b6aa-780295200d4e/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.262882 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9730846e-4e43-45f7-8b5d-38d4753b3b80" path="/var/lib/kubelet/pods/9730846e-4e43-45f7-8b5d-38d4753b3b80/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264751 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="f2a0991e4200eaba753ab9243604efaa3906af593fb0112ee1a38be84f321412" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264788 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="4e7b00175d4cce7fcb0347c3d927ada241593a834021b289319e8fb80e3be8d0" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264800 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="9e38b526a388417cc886c1224a1c67b02aaed0db6c47390206d2232611f965db" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264812 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="5ea90ff3252e8f33e1b49dcc054b155d3ca71041184f597d0ebf57cf05baa4d3" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264821 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="3cebc2a254087212c9cd362b275183545752ff8879dd6da9c51d87070dfa4dab" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264829 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="f7948c4c2ea98b702b132274cc49f43a7f9f174f2d4a40192b402344998935a2" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264838 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="2f0532da04eff45cc1a9f36829b01187bceb564dbe88cdc93585471ff9447783" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264845 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="9cc09e193c62f0685bbeb053af7bcee09164f6f7a1f094768044b967541fdd99" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264852 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="e4b8c37d71f04f91b5041bbfd2ebe5398fdb2a0eabb2c788da216bac1701192b" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264859 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="902624d332ba6e97b89215d66460f6d1894207ecaac9579489c0f78d378469ed" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264867 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="851cf5e8ab4d46230b00d65ff6d2fe461116124fb2be946904a0939073438463" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264885 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" 
containerID="e165e57a40f6bc0c996660f2d010eaf2048fe443cee81d3a5bb35a9b618cefea" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264893 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="8c14da07d1d3d581ed1cdb3f263473ebec5da8f60db91aaf8dae95021a6acbfa" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.264902 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="894c79912ca8aa5fcde9530fa2c8b0075ce313eea0c4e93fccea72f8d718e504" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.268224 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.268248 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.268260 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.270649 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7b64f0e-f7ef-4737-a543-edba91ed6811" path="/var/lib/kubelet/pods/a7b64f0e-f7ef-4737-a543-edba91ed6811/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.278276 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1e919242-0fa9-4c26-90c5-718fec0a9109" (UID: "1e919242-0fa9-4c26-90c5-718fec0a9109"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.278740 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a96f22ef-0591-44e5-a14d-4517ca0dcf14" path="/var/lib/kubelet/pods/a96f22ef-0591-44e5-a14d-4517ca0dcf14/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.279643 4753 generic.go:334] "Generic (PLEG): container finished" podID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" containerID="125994f3ec1405b9e8dad0539053aa0a87df73ee58d6e17040fc7d0d422ceb19" exitCode=143 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.287169 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "dd12b7e9-4fd2-4ae9-8e74-d8499726d995" (UID: "dd12b7e9-4fd2-4ae9-8e74-d8499726d995"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.292113 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.296093 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2d3ab6e-1910-48a5-bbd7-c4ec38a37571" path="/var/lib/kubelet/pods/b2d3ab6e-1910-48a5-bbd7-c4ec38a37571/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.296780 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b47fe495-e567-4097-82be-eff62586e721" path="/var/lib/kubelet/pods/b47fe495-e567-4097-82be-eff62586e721/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.297427 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d314405b-2bd4-44b1-93f3-89d92059a50c" path="/var/lib/kubelet/pods/d314405b-2bd4-44b1-93f3-89d92059a50c/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.304559 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rdpjg_1e919242-0fa9-4c26-90c5-718fec0a9109/openstack-network-exporter/0.log" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.304680 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rdpjg" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.317362 4753 generic.go:334] "Generic (PLEG): container finished" podID="3184c640-0157-4211-aa5a-aada8557e9f8" containerID="f5895e5563577e5f99ca58e92f19470e5b0e974e28396e83be746eec355480e3" exitCode=143 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.322227 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_dd12b7e9-4fd2-4ae9-8e74-d8499726d995/ovsdbserver-nb/0.log" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.322316 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.325035 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3381688-f092-411b-b2b2-5cdd67813b7d" path="/var/lib/kubelet/pods/e3381688-f092-411b-b2b2-5cdd67813b7d/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.328357 4753 generic.go:334] "Generic (PLEG): container finished" podID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerID="ad233db4e8b1fd3e1f6f172ac5f80cdfd53b96374113b3ddbb73831c7e9cb5c9" exitCode=143 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.334310 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd640e3a-86af-406d-83c6-2df59b891fc3" path="/var/lib/kubelet/pods/fd640e3a-86af-406d-83c6-2df59b891fc3/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.348631 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "08759bfe-4b2e-4da9-b0b0-2149a71a831e" (UID: "08759bfe-4b2e-4da9-b0b0-2149a71a831e"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.350271 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "dd12b7e9-4fd2-4ae9-8e74-d8499726d995" (UID: "dd12b7e9-4fd2-4ae9-8e74-d8499726d995"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.352576 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdfa810e-df55-4e25-8280-b6275c932336" path="/var/lib/kubelet/pods/fdfa810e-df55-4e25-8280-b6275c932336/volumes" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.355606 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"f2a0991e4200eaba753ab9243604efaa3906af593fb0112ee1a38be84f321412"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.358861 4753 generic.go:334] "Generic (PLEG): container finished" podID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerID="b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.359832 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.359901 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"4e7b00175d4cce7fcb0347c3d927ada241593a834021b289319e8fb80e3be8d0"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.359941 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"9e38b526a388417cc886c1224a1c67b02aaed0db6c47390206d2232611f965db"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.359953 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.359968 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8c5ld"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.359978 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"5ea90ff3252e8f33e1b49dcc054b155d3ca71041184f597d0ebf57cf05baa4d3"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.359990 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8c5ld"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.360729 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" containerName="nova-cell1-conductor-conductor" containerID="cri-o://1b5479e0d4430d6fc3745e3ed9afa5a4334d10df67d89187482d0455266b8f05" gracePeriod=30 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.363338 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="fb7a325b-7833-484f-8bba-7dc85ebf57cd" containerName="nova-cell0-conductor-conductor" containerID="cri-o://cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf" gracePeriod=30 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367357 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"3cebc2a254087212c9cd362b275183545752ff8879dd6da9c51d87070dfa4dab"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367403 4753 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"f7948c4c2ea98b702b132274cc49f43a7f9f174f2d4a40192b402344998935a2"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367414 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"2f0532da04eff45cc1a9f36829b01187bceb564dbe88cdc93585471ff9447783"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367442 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"9cc09e193c62f0685bbeb053af7bcee09164f6f7a1f094768044b967541fdd99"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367450 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"e4b8c37d71f04f91b5041bbfd2ebe5398fdb2a0eabb2c788da216bac1701192b"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367459 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"902624d332ba6e97b89215d66460f6d1894207ecaac9579489c0f78d378469ed"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367469 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"851cf5e8ab4d46230b00d65ff6d2fe461116124fb2be946904a0939073438463"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367478 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"e165e57a40f6bc0c996660f2d010eaf2048fe443cee81d3a5bb35a9b618cefea"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367486 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"8c14da07d1d3d581ed1cdb3f263473ebec5da8f60db91aaf8dae95021a6acbfa"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367511 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"894c79912ca8aa5fcde9530fa2c8b0075ce313eea0c4e93fccea72f8d718e504"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367521 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" event={"ID":"5766d009-a05f-4e8e-8267-9bd6c1267d3a","Type":"ContainerDied","Data":"125994f3ec1405b9e8dad0539053aa0a87df73ee58d6e17040fc7d0d422ceb19"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367538 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-867cd545c7-zrdnm" event={"ID":"386b408e-6ac5-4f1d-8403-be4fc8aec57d","Type":"ContainerDied","Data":"1f6dc0380a03f660a067b38b55a506e9e2023e8197e9373759736a6c99588bed"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367552 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rdpjg" event={"ID":"1e919242-0fa9-4c26-90c5-718fec0a9109","Type":"ContainerDied","Data":"c4195b05dd8b9b921e979b735020e1af4d9377731db0ebb4181aff5c714cd9db"} Jan 29 14:24:36 
crc kubenswrapper[4753]: I0129 14:24:36.367562 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3184c640-0157-4211-aa5a-aada8557e9f8","Type":"ContainerDied","Data":"f5895e5563577e5f99ca58e92f19470e5b0e974e28396e83be746eec355480e3"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367590 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"dd12b7e9-4fd2-4ae9-8e74-d8499726d995","Type":"ContainerDied","Data":"67fde236cb8f75ff617756354524ff65f2bab7137de4fa8807321371899c1ccf"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367602 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" event={"ID":"62921d0c-9482-49a2-8d20-5dda61ba80da","Type":"ContainerDied","Data":"ad233db4e8b1fd3e1f6f172ac5f80cdfd53b96374113b3ddbb73831c7e9cb5c9"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367615 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86d005cc-e014-44b4-b8fa-f402d656ae5a","Type":"ContainerDied","Data":"b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.367635 4753 scope.go:117] "RemoveContainer" containerID="f7c124c9fdda70a8e8042c2355a73ef99b59bc98a46f42c31fb3d71e038de5ed" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.370039 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts\") pod \"nova-cell1-ced3-account-create-update-sg22z\" (UID: \"0de6053f-c9da-4625-a175-c9649c10d462\") " pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.370141 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78skc\" (UniqueName: \"kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc\") pod \"nova-cell1-ced3-account-create-update-sg22z\" (UID: \"0de6053f-c9da-4625-a175-c9649c10d462\") " pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.370218 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.370234 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e919242-0fa9-4c26-90c5-718fec0a9109-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.370244 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/08759bfe-4b2e-4da9-b0b0-2149a71a831e-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.372076 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd12b7e9-4fd2-4ae9-8e74-d8499726d995-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.371050 4753 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap 
"openstack-cell1-scripts" not found Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.370916 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.372319 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts podName:0de6053f-c9da-4625-a175-c9649c10d462 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:38.372298269 +0000 UTC m=+1313.067032651 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts") pod "nova-cell1-ced3-account-create-update-sg22z" (UID: "0de6053f-c9da-4625-a175-c9649c10d462") : configmap "openstack-cell1-scripts" not found Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.376242 4753 projected.go:194] Error preparing data for projected volume kube-api-access-78skc for pod openstack/nova-cell1-ced3-account-create-update-sg22z: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.376302 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc podName:0de6053f-c9da-4625-a175-c9649c10d462 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:38.376287926 +0000 UTC m=+1313.071022308 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-78skc" (UniqueName: "kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc") pod "nova-cell1-ced3-account-create-update-sg22z" (UID: "0de6053f-c9da-4625-a175-c9649c10d462") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.380200 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-lzht5"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.381959 4753 generic.go:334] "Generic (PLEG): container finished" podID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.382019 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-98h7m" event={"ID":"a17eeeff-955e-4718-9e0e-15fae4b8d9db","Type":"ContainerDied","Data":"f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.384357 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9257-account-create-update-bzjs4" event={"ID":"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0","Type":"ContainerStarted","Data":"5b79ba2152ca86d6581051db0978b562feeada90c67e7cf05bd4476254c489d3"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.391408 4753 generic.go:334] "Generic (PLEG): container finished" podID="13672aee-1e34-4763-88d7-35ac9b484c87" containerID="63959aa4ed603fda25a0942fccae384fdcda0338c3bd1c0131967af7f34b728b" exitCode=143 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.391511 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"13672aee-1e34-4763-88d7-35ac9b484c87","Type":"ContainerDied","Data":"63959aa4ed603fda25a0942fccae384fdcda0338c3bd1c0131967af7f34b728b"} Jan 29 14:24:36 crc 
kubenswrapper[4753]: I0129 14:24:36.404322 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_08759bfe-4b2e-4da9-b0b0-2149a71a831e/ovsdbserver-sb/0.log" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.404463 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"08759bfe-4b2e-4da9-b0b0-2149a71a831e","Type":"ContainerDied","Data":"be1e3ce1ccf7a3365d57f5121e4998099ecf88dd17f49e0f0e325e53962e3e79"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.404547 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.405993 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-870b-account-create-update-kndlh"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.408222 4753 generic.go:334] "Generic (PLEG): container finished" podID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerID="eae4f86edd4d25f3149ae9e9e7b406efbd7b4f7e532051ca604088e784fc5e54" exitCode=143 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.408282 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-754c57f55b-2hkbd" event={"ID":"920278e2-a31f-4ad2-81be-d30a799b9d64","Type":"ContainerDied","Data":"eae4f86edd4d25f3149ae9e9e7b406efbd7b4f7e532051ca604088e784fc5e54"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.412311 4753 generic.go:334] "Generic (PLEG): container finished" podID="6810e266-dec6-4731-884b-067f214781c2" containerID="b0fc984341d3bf9cf81937c474fb2a82c3b897efaf7a2c1e16681e411cbe9085" exitCode=143 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.412361 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6810e266-dec6-4731-884b-067f214781c2","Type":"ContainerDied","Data":"b0fc984341d3bf9cf81937c474fb2a82c3b897efaf7a2c1e16681e411cbe9085"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.413590 4753 generic.go:334] "Generic (PLEG): container finished" podID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerID="ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560" exitCode=0 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.413632 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55965d95bf-pftcq" event={"ID":"1897b4f4-3f70-4584-9801-59c207f4d1db","Type":"ContainerDied","Data":"ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.413899 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f8b4-account-create-update-8xzhc"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.415081 4753 generic.go:334] "Generic (PLEG): container finished" podID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerID="f34f5bfa0d06f0211a2a796c39c41d96c4d203bd1909c7053e13afe8556789c1" exitCode=143 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.415138 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.418458 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7947b8f0-b134-40d9-beba-116bbb51a1c2","Type":"ContainerDied","Data":"f34f5bfa0d06f0211a2a796c39c41d96c4d203bd1909c7053e13afe8556789c1"} Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.419461 4753 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-c286z" secret="" err="secret \"galera-openstack-cell1-dockercfg-5cl64\" not found" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.421227 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:36 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: if [ -n "neutron" ]; then Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="neutron" Jan 29 14:24:36 crc kubenswrapper[4753]: else Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:36 crc kubenswrapper[4753]: fi Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:36 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:36 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:36 crc kubenswrapper[4753]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:36 crc kubenswrapper[4753]: # support updates Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.421423 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:36 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: if [ -n "" ]; then Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="" Jan 29 14:24:36 crc kubenswrapper[4753]: else Jan 29 14:24:36 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:36 crc kubenswrapper[4753]: fi Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:36 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:36 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:36 crc kubenswrapper[4753]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:36 crc kubenswrapper[4753]: # support updates Jan 29 14:24:36 crc kubenswrapper[4753]: Jan 29 14:24:36 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.422551 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-c286z" podUID="558ab235-719b-4913-aab2-863bfa6586e8" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.422585 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-b7f8-account-create-update-2zm4w" podUID="0dc24c61-81b8-4f51-a2c7-961548f1b11b" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.424269 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c053-account-create-update-x8gmz"] Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.478590 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.478669 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data podName:5f7e3e27-a036-4623-8d63-557a3c0d76e6 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:40.478655143 +0000 UTC m=+1315.173389525 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6") : configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.501232 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-rdpjg"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.516338 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-rdpjg"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.532033 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-zrdnm"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.558611 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-867cd545c7-zrdnm"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.720323 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.746639 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7896b659cf-r8vcb"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.746873 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7896b659cf-r8vcb" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-httpd" containerID="cri-o://94767674e99c2eb5b0abc2208da66b5b5d0e5bbb98eae7fc0cc7902d80c6ed4b" gracePeriod=30 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.747008 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7896b659cf-r8vcb" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-server" containerID="cri-o://c03517e0292b57c6b359b996af8065e26ca76f92b447521f128204efb05538e3" gracePeriod=30 Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.759946 4753 scope.go:117] "RemoveContainer" containerID="f30589dcc7d57df07d54caec7f4038d30558bc380d3ec02279fbac2497ee8ee2" Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.780512 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.790779 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.812473 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.842754 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 14:24:36 crc kubenswrapper[4753]: I0129 14:24:36.865544 4753 scope.go:117] "RemoveContainer" containerID="2e8b4350e06c3e2f7876fe0bc0220340e6c1e15c1569e84e4a38084b04546f2d" Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.899044 4753 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 29 14:24:36 crc kubenswrapper[4753]: E0129 14:24:36.899115 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts podName:558ab235-719b-4913-aab2-863bfa6586e8 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:38.899100857 +0000 UTC m=+1313.593835229 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts") pod "root-account-create-update-c286z" (UID: "558ab235-719b-4913-aab2-863bfa6586e8") : configmap "openstack-cell1-scripts" not found Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.022282 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.167:8776/healthcheck\": read tcp 10.217.0.2:39068->10.217.0.167:8776: read: connection reset by peer" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.054343 4753 scope.go:117] "RemoveContainer" containerID="8be6e8569c31079d6223e7f119db680fdc968fb974c6cb3d3a9876712caa477d" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.184087 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.184819 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.185996 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.186488 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.186534 4753 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.194196 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 29 14:24:37 crc 
kubenswrapper[4753]: E0129 14:24:37.212351 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.212437 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.216707 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.216759 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data podName:ad5c04aa-ed92-4c33-ad37-4420b362e237 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:41.216744901 +0000 UTC m=+1315.911479283 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data") pod "rabbitmq-server-0" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237") : configmap "rabbitmq-config-data" not found Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.437443 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c053-account-create-update-x8gmz" event={"ID":"31a4dbbc-e575-4d30-bba1-f3785c1e497e","Type":"ContainerStarted","Data":"ef76beacb15a0486d7984825417dab8ed22cd4b79aeda34621b5c622d6ad73c0"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.445010 4753 generic.go:334] "Generic (PLEG): container finished" podID="001ea12a-a725-4cd9-a12e-1442d56f7068" containerID="b638be6aff7479c4c3b3e3e264266ee7a0c8949f1731326d7adf23a76a43271b" exitCode=0 Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.445100 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"001ea12a-a725-4cd9-a12e-1442d56f7068","Type":"ContainerDied","Data":"b638be6aff7479c4c3b3e3e264266ee7a0c8949f1731326d7adf23a76a43271b"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.462821 4753 generic.go:334] "Generic (PLEG): container finished" podID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerID="a111f231b7967402acc681815c59ca3c8b5a6d1e5677b5d94e8de77f77841cbc" exitCode=0 Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.462909 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f2dbc378-044c-49a2-a891-94a90a0acff1","Type":"ContainerDied","Data":"a111f231b7967402acc681815c59ca3c8b5a6d1e5677b5d94e8de77f77841cbc"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.487612 4753 generic.go:334] "Generic (PLEG): container finished" podID="de34f6dc-67dd-4054-84a9-a051e0ba2876" containerID="857539f16ab7d5f43263fb3fc69c40c7cfc6306d139f5a0e01c992147a025e17" exitCode=0 Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.487676 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"de34f6dc-67dd-4054-84a9-a051e0ba2876","Type":"ContainerDied","Data":"857539f16ab7d5f43263fb3fc69c40c7cfc6306d139f5a0e01c992147a025e17"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.487701 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"de34f6dc-67dd-4054-84a9-a051e0ba2876","Type":"ContainerDied","Data":"648e99ffb8599c2ec868390143f0a3162a551d58bc4c5e7e71ce50e71e20b390"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.487712 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="648e99ffb8599c2ec868390143f0a3162a551d58bc4c5e7e71ce50e71e20b390" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.488609 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.490139 4753 scope.go:117] "RemoveContainer" containerID="1b90c85fe79f6d643048d2a3c0d72a78c8d8d608ab45d4967e59e9a9eb61c33c" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.519376 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-870b-account-create-update-kndlh" event={"ID":"dfce025b-40c8-4ae3-b1c2-a1d858e11adb","Type":"ContainerStarted","Data":"060753a9957019838dad4cb7b926bdf9d210e7f430bdbda558a55e00bd88b43e"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.526552 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.526893 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fad8-account-create-update-lzht5" event={"ID":"acfc5598-76b4-4673-aeef-b9105a8e6853","Type":"ContainerStarted","Data":"d1b5d984490842cc087fa4858e3e2c4c430bd18896c94ed0d1afbaf2a74d0a2f"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.536515 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f8b4-account-create-update-8xzhc" event={"ID":"ef1a7724-ed52-4481-bbaa-1a0464561b2a","Type":"ContainerStarted","Data":"7109356fbf081180048decd7c2ba3ea0425623b3359ea764cd0507f19323e431"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.568798 4753 scope.go:117] "RemoveContainer" containerID="b9f6872bb6478c7a1880d301b55f528a8869bd68eb2346a5287f912f5e9ed844" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.574867 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7896b659cf-r8vcb" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.574995 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7896b659cf-r8vcb" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.617743 4753 generic.go:334] "Generic (PLEG): container finished" podID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerID="4aa6335e18e43896614c27e4605f0813d42217c00887d65f886c827434a9fc80" exitCode=0 Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.617813 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" event={"ID":"62921d0c-9482-49a2-8d20-5dda61ba80da","Type":"ContainerDied","Data":"4aa6335e18e43896614c27e4605f0813d42217c00887d65f886c827434a9fc80"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.623082 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-operator-scripts\") pod \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\" (UID: \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.623187 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v8nw\" (UniqueName: \"kubernetes.io/projected/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-kube-api-access-9v8nw\") pod \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\" (UID: \"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.623244 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-combined-ca-bundle\") pod \"de34f6dc-67dd-4054-84a9-a051e0ba2876\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.623314 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-vencrypt-tls-certs\") pod \"de34f6dc-67dd-4054-84a9-a051e0ba2876\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.623371 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-config-data\") pod \"de34f6dc-67dd-4054-84a9-a051e0ba2876\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.623391 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-nova-novncproxy-tls-certs\") pod \"de34f6dc-67dd-4054-84a9-a051e0ba2876\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.623416 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4pt2\" (UniqueName: \"kubernetes.io/projected/de34f6dc-67dd-4054-84a9-a051e0ba2876-kube-api-access-f4pt2\") pod \"de34f6dc-67dd-4054-84a9-a051e0ba2876\" (UID: \"de34f6dc-67dd-4054-84a9-a051e0ba2876\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.633261 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-kt2lt"] Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.633788 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerName="openstack-network-exporter" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.633807 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerName="openstack-network-exporter" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.633831 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386b408e-6ac5-4f1d-8403-be4fc8aec57d" containerName="init" Jan 29 14:24:37 crc 
kubenswrapper[4753]: I0129 14:24:37.633839 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="386b408e-6ac5-4f1d-8403-be4fc8aec57d" containerName="init" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.633847 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerName="ovsdbserver-nb" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.633853 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerName="ovsdbserver-nb" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.633868 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de34f6dc-67dd-4054-84a9-a051e0ba2876" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.633873 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="de34f6dc-67dd-4054-84a9-a051e0ba2876" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.633887 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386b408e-6ac5-4f1d-8403-be4fc8aec57d" containerName="dnsmasq-dns" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.633895 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="386b408e-6ac5-4f1d-8403-be4fc8aec57d" containerName="dnsmasq-dns" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.633966 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerName="ovsdbserver-sb" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.633974 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerName="ovsdbserver-sb" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.634008 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerName="openstack-network-exporter" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.634015 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerName="openstack-network-exporter" Jan 29 14:24:37 crc kubenswrapper[4753]: E0129 14:24:37.634024 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e919242-0fa9-4c26-90c5-718fec0a9109" containerName="openstack-network-exporter" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.634030 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e919242-0fa9-4c26-90c5-718fec0a9109" containerName="openstack-network-exporter" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.634324 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerName="openstack-network-exporter" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.634336 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="386b408e-6ac5-4f1d-8403-be4fc8aec57d" containerName="dnsmasq-dns" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.634353 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e919242-0fa9-4c26-90c5-718fec0a9109" containerName="openstack-network-exporter" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.634367 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="de34f6dc-67dd-4054-84a9-a051e0ba2876" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.634375 4753 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerName="openstack-network-exporter" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.634384 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" containerName="ovsdbserver-nb" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.634392 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" containerName="ovsdbserver-sb" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.635031 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.638875 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de34f6dc-67dd-4054-84a9-a051e0ba2876-kube-api-access-f4pt2" (OuterVolumeSpecName: "kube-api-access-f4pt2") pod "de34f6dc-67dd-4054-84a9-a051e0ba2876" (UID: "de34f6dc-67dd-4054-84a9-a051e0ba2876"). InnerVolumeSpecName "kube-api-access-f4pt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.639199 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.640022 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf251fba-e960-4f2f-ad9a-117ce9a2f5a0" (UID: "bf251fba-e960-4f2f-ad9a-117ce9a2f5a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.640567 4753 generic.go:334] "Generic (PLEG): container finished" podID="82c06532-183f-4527-b423-630110596fb4" containerID="c03517e0292b57c6b359b996af8065e26ca76f92b447521f128204efb05538e3" exitCode=0 Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.640587 4753 generic.go:334] "Generic (PLEG): container finished" podID="82c06532-183f-4527-b423-630110596fb4" containerID="94767674e99c2eb5b0abc2208da66b5b5d0e5bbb98eae7fc0cc7902d80c6ed4b" exitCode=0 Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.640625 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7896b659cf-r8vcb" event={"ID":"82c06532-183f-4527-b423-630110596fb4","Type":"ContainerDied","Data":"c03517e0292b57c6b359b996af8065e26ca76f92b447521f128204efb05538e3"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.640649 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7896b659cf-r8vcb" event={"ID":"82c06532-183f-4527-b423-630110596fb4","Type":"ContainerDied","Data":"94767674e99c2eb5b0abc2208da66b5b5d0e5bbb98eae7fc0cc7902d80c6ed4b"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.651902 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-kube-api-access-9v8nw" (OuterVolumeSpecName: "kube-api-access-9v8nw") pod "bf251fba-e960-4f2f-ad9a-117ce9a2f5a0" (UID: "bf251fba-e960-4f2f-ad9a-117ce9a2f5a0"). InnerVolumeSpecName "kube-api-access-9v8nw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.659607 4753 generic.go:334] "Generic (PLEG): container finished" podID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" containerID="4c6f60456d1d153f776a9722a5c5c012f82210a1ce9b96301d44ce657ac865a5" exitCode=0 Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.659677 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" event={"ID":"5766d009-a05f-4e8e-8267-9bd6c1267d3a","Type":"ContainerDied","Data":"4c6f60456d1d153f776a9722a5c5c012f82210a1ce9b96301d44ce657ac865a5"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.662544 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ced3-account-create-update-sg22z" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.665461 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9257-account-create-update-bzjs4" event={"ID":"bf251fba-e960-4f2f-ad9a-117ce9a2f5a0","Type":"ContainerDied","Data":"5b79ba2152ca86d6581051db0978b562feeada90c67e7cf05bd4476254c489d3"} Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.665489 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9257-account-create-update-bzjs4" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.667093 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kt2lt"] Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.683380 4753 scope.go:117] "RemoveContainer" containerID="fa247e070c5c59fdd8d782d513689c0f7fef6b0f207b3bdcee7043f734fd870a" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.683642 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-config-data" (OuterVolumeSpecName: "config-data") pod "de34f6dc-67dd-4054-84a9-a051e0ba2876" (UID: "de34f6dc-67dd-4054-84a9-a051e0ba2876"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.704270 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de34f6dc-67dd-4054-84a9-a051e0ba2876" (UID: "de34f6dc-67dd-4054-84a9-a051e0ba2876"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.720414 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "de34f6dc-67dd-4054-84a9-a051e0ba2876" (UID: "de34f6dc-67dd-4054-84a9-a051e0ba2876"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.725909 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/964a679c-ce73-46ec-8e88-37c972fbe817-operator-scripts\") pod \"root-account-create-update-kt2lt\" (UID: \"964a679c-ce73-46ec-8e88-37c972fbe817\") " pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.726192 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9h8\" (UniqueName: \"kubernetes.io/projected/964a679c-ce73-46ec-8e88-37c972fbe817-kube-api-access-vf9h8\") pod \"root-account-create-update-kt2lt\" (UID: \"964a679c-ce73-46ec-8e88-37c972fbe817\") " pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.726797 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4pt2\" (UniqueName: \"kubernetes.io/projected/de34f6dc-67dd-4054-84a9-a051e0ba2876-kube-api-access-f4pt2\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.726904 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.726992 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v8nw\" (UniqueName: \"kubernetes.io/projected/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0-kube-api-access-9v8nw\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.727074 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.727144 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.727220 4753 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.743419 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "de34f6dc-67dd-4054-84a9-a051e0ba2876" (UID: "de34f6dc-67dd-4054-84a9-a051e0ba2876"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.753038 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="ad5c04aa-ed92-4c33-ad37-4420b362e237" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.768456 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9257-account-create-update-bzjs4"] Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.776247 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9257-account-create-update-bzjs4"] Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.788354 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-sg22z"] Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.791814 4753 scope.go:117] "RemoveContainer" containerID="0a4e3d482c90f0b0f41a33a9b8df7ea7922b37348c22a7403543d700dae2ebaf" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.794905 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ced3-account-create-update-sg22z"] Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.795664 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.842332 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/964a679c-ce73-46ec-8e88-37c972fbe817-operator-scripts\") pod \"root-account-create-update-kt2lt\" (UID: \"964a679c-ce73-46ec-8e88-37c972fbe817\") " pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.842702 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9h8\" (UniqueName: \"kubernetes.io/projected/964a679c-ce73-46ec-8e88-37c972fbe817-kube-api-access-vf9h8\") pod \"root-account-create-update-kt2lt\" (UID: \"964a679c-ce73-46ec-8e88-37c972fbe817\") " pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.842753 4753 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/de34f6dc-67dd-4054-84a9-a051e0ba2876-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.843998 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/964a679c-ce73-46ec-8e88-37c972fbe817-operator-scripts\") pod \"root-account-create-update-kt2lt\" (UID: \"964a679c-ce73-46ec-8e88-37c972fbe817\") " pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.874339 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9h8\" (UniqueName: \"kubernetes.io/projected/964a679c-ce73-46ec-8e88-37c972fbe817-kube-api-access-vf9h8\") pod \"root-account-create-update-kt2lt\" (UID: \"964a679c-ce73-46ec-8e88-37c972fbe817\") " pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.943935 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data-custom\") pod \"f2dbc378-044c-49a2-a891-94a90a0acff1\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.943986 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-combined-ca-bundle\") pod \"f2dbc378-044c-49a2-a891-94a90a0acff1\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944086 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data\") pod \"f2dbc378-044c-49a2-a891-94a90a0acff1\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944136 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2dbc378-044c-49a2-a891-94a90a0acff1-logs\") pod \"f2dbc378-044c-49a2-a891-94a90a0acff1\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944187 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-internal-tls-certs\") pod \"f2dbc378-044c-49a2-a891-94a90a0acff1\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944206 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-public-tls-certs\") pod \"f2dbc378-044c-49a2-a891-94a90a0acff1\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944222 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-scripts\") pod \"f2dbc378-044c-49a2-a891-94a90a0acff1\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944299 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-924sv\" (UniqueName: \"kubernetes.io/projected/f2dbc378-044c-49a2-a891-94a90a0acff1-kube-api-access-924sv\") pod \"f2dbc378-044c-49a2-a891-94a90a0acff1\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944376 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2dbc378-044c-49a2-a891-94a90a0acff1-etc-machine-id\") pod \"f2dbc378-044c-49a2-a891-94a90a0acff1\" (UID: \"f2dbc378-044c-49a2-a891-94a90a0acff1\") " Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944777 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0de6053f-c9da-4625-a175-c9649c10d462-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944795 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78skc\" (UniqueName: \"kubernetes.io/projected/0de6053f-c9da-4625-a175-c9649c10d462-kube-api-access-78skc\") on node \"crc\" DevicePath \"\"" Jan 29 
14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.944866 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2dbc378-044c-49a2-a891-94a90a0acff1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f2dbc378-044c-49a2-a891-94a90a0acff1" (UID: "f2dbc378-044c-49a2-a891-94a90a0acff1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.959508 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2dbc378-044c-49a2-a891-94a90a0acff1-logs" (OuterVolumeSpecName: "logs") pod "f2dbc378-044c-49a2-a891-94a90a0acff1" (UID: "f2dbc378-044c-49a2-a891-94a90a0acff1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.960333 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f2dbc378-044c-49a2-a891-94a90a0acff1" (UID: "f2dbc378-044c-49a2-a891-94a90a0acff1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.960610 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.972029 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2dbc378-044c-49a2-a891-94a90a0acff1-kube-api-access-924sv" (OuterVolumeSpecName: "kube-api-access-924sv") pod "f2dbc378-044c-49a2-a891-94a90a0acff1" (UID: "f2dbc378-044c-49a2-a891-94a90a0acff1"). InnerVolumeSpecName "kube-api-access-924sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:37 crc kubenswrapper[4753]: I0129 14:24:37.995175 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-scripts" (OuterVolumeSpecName: "scripts") pod "f2dbc378-044c-49a2-a891-94a90a0acff1" (UID: "f2dbc378-044c-49a2-a891-94a90a0acff1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.033002 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2dbc378-044c-49a2-a891-94a90a0acff1" (UID: "f2dbc378-044c-49a2-a891-94a90a0acff1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.046394 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.046438 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.046447 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2dbc378-044c-49a2-a891-94a90a0acff1-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.046455 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.046465 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-924sv\" (UniqueName: \"kubernetes.io/projected/f2dbc378-044c-49a2-a891-94a90a0acff1-kube-api-access-924sv\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.046476 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2dbc378-044c-49a2-a891-94a90a0acff1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.052375 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data" (OuterVolumeSpecName: "config-data") pod "f2dbc378-044c-49a2-a891-94a90a0acff1" (UID: "f2dbc378-044c-49a2-a891-94a90a0acff1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.055077 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f2dbc378-044c-49a2-a891-94a90a0acff1" (UID: "f2dbc378-044c-49a2-a891-94a90a0acff1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.055372 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f2dbc378-044c-49a2-a891-94a90a0acff1" (UID: "f2dbc378-044c-49a2-a891-94a90a0acff1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.114796 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.148492 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.148526 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.148539 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dbc378-044c-49a2-a891-94a90a0acff1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.210726 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.210895 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08759bfe-4b2e-4da9-b0b0-2149a71a831e" path="/var/lib/kubelet/pods/08759bfe-4b2e-4da9-b0b0-2149a71a831e/volumes" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.211457 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0de6053f-c9da-4625-a175-c9649c10d462" path="/var/lib/kubelet/pods/0de6053f-c9da-4625-a175-c9649c10d462/volumes" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.211954 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e919242-0fa9-4c26-90c5-718fec0a9109" path="/var/lib/kubelet/pods/1e919242-0fa9-4c26-90c5-718fec0a9109/volumes" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.213011 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386b408e-6ac5-4f1d-8403-be4fc8aec57d" path="/var/lib/kubelet/pods/386b408e-6ac5-4f1d-8403-be4fc8aec57d/volumes" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.214071 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf251fba-e960-4f2f-ad9a-117ce9a2f5a0" path="/var/lib/kubelet/pods/bf251fba-e960-4f2f-ad9a-117ce9a2f5a0/volumes" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.214478 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8" path="/var/lib/kubelet/pods/d6ffcfa6-8ed5-4d92-9e63-87756c03f8e8/volumes" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.215134 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd12b7e9-4fd2-4ae9-8e74-d8499726d995" path="/var/lib/kubelet/pods/dd12b7e9-4fd2-4ae9-8e74-d8499726d995/volumes" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.387799 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tf7g\" (UniqueName: \"kubernetes.io/projected/001ea12a-a725-4cd9-a12e-1442d56f7068-kube-api-access-5tf7g\") pod \"001ea12a-a725-4cd9-a12e-1442d56f7068\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.388042 4753 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-operator-scripts\") pod \"001ea12a-a725-4cd9-a12e-1442d56f7068\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.388102 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-default\") pod \"001ea12a-a725-4cd9-a12e-1442d56f7068\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.388125 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"001ea12a-a725-4cd9-a12e-1442d56f7068\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.388189 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-galera-tls-certs\") pod \"001ea12a-a725-4cd9-a12e-1442d56f7068\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.388254 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-combined-ca-bundle\") pod \"001ea12a-a725-4cd9-a12e-1442d56f7068\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.388320 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-kolla-config\") pod \"001ea12a-a725-4cd9-a12e-1442d56f7068\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.388366 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-generated\") pod \"001ea12a-a725-4cd9-a12e-1442d56f7068\" (UID: \"001ea12a-a725-4cd9-a12e-1442d56f7068\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.389025 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "001ea12a-a725-4cd9-a12e-1442d56f7068" (UID: "001ea12a-a725-4cd9-a12e-1442d56f7068"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.393640 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "001ea12a-a725-4cd9-a12e-1442d56f7068" (UID: "001ea12a-a725-4cd9-a12e-1442d56f7068"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.393891 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "001ea12a-a725-4cd9-a12e-1442d56f7068" (UID: "001ea12a-a725-4cd9-a12e-1442d56f7068"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.403252 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "001ea12a-a725-4cd9-a12e-1442d56f7068" (UID: "001ea12a-a725-4cd9-a12e-1442d56f7068"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.409383 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001ea12a-a725-4cd9-a12e-1442d56f7068-kube-api-access-5tf7g" (OuterVolumeSpecName: "kube-api-access-5tf7g") pod "001ea12a-a725-4cd9-a12e-1442d56f7068" (UID: "001ea12a-a725-4cd9-a12e-1442d56f7068"). InnerVolumeSpecName "kube-api-access-5tf7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.418288 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "001ea12a-a725-4cd9-a12e-1442d56f7068" (UID: "001ea12a-a725-4cd9-a12e-1442d56f7068"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.424304 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "001ea12a-a725-4cd9-a12e-1442d56f7068" (UID: "001ea12a-a725-4cd9-a12e-1442d56f7068"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.425241 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.445838 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.460466 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7896b659cf-r8vcb" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.479779 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.490524 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.490565 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.490576 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.490585 4753 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.490596 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/001ea12a-a725-4cd9-a12e-1442d56f7068-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.490607 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tf7g\" (UniqueName: \"kubernetes.io/projected/001ea12a-a725-4cd9-a12e-1442d56f7068-kube-api-access-5tf7g\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.490618 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001ea12a-a725-4cd9-a12e-1442d56f7068-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.492331 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "001ea12a-a725-4cd9-a12e-1442d56f7068" (UID: "001ea12a-a725-4cd9-a12e-1442d56f7068"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.517123 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.534020 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": read tcp 10.217.0.2:42278->10.217.0.205:8775: read: connection reset by peer" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.534534 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.534667 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": read tcp 10.217.0.2:42282->10.217.0.205:8775: read: connection reset by peer" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.555465 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c286z" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.613016 4753 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/001ea12a-a725-4cd9-a12e-1442d56f7068-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.643269 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.645650 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.646177 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="ceilometer-central-agent" containerID="cri-o://77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd" gracePeriod=30 Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.646325 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="proxy-httpd" containerID="cri-o://0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc" gracePeriod=30 Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.646386 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="sg-core" containerID="cri-o://8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4" gracePeriod=30 Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.646433 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="ceilometer-notification-agent" containerID="cri-o://f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5" gracePeriod=30 Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.699541 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.708219 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c053-account-create-update-x8gmz" event={"ID":"31a4dbbc-e575-4d30-bba1-f3785c1e497e","Type":"ContainerDied","Data":"ef76beacb15a0486d7984825417dab8ed22cd4b79aeda34621b5c622d6ad73c0"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.708312 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c053-account-create-update-x8gmz" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.709808 4753 generic.go:334] "Generic (PLEG): container finished" podID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerID="7928bff86f6434736cea8c468f448a8ff6bae7916724dac5e9b712cc68b65281" exitCode=0 Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.709843 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cbccfdbb-x7lwh" event={"ID":"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7","Type":"ContainerDied","Data":"7928bff86f6434736cea8c468f448a8ff6bae7916724dac5e9b712cc68b65281"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.710762 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f8b4-account-create-update-8xzhc" event={"ID":"ef1a7724-ed52-4481-bbaa-1a0464561b2a","Type":"ContainerDied","Data":"7109356fbf081180048decd7c2ba3ea0425623b3359ea764cd0507f19323e431"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.710822 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f8b4-account-create-update-8xzhc" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714221 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-run-httpd\") pod \"82c06532-183f-4527-b423-630110596fb4\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714265 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmrjc\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-kube-api-access-rmrjc\") pod \"82c06532-183f-4527-b423-630110596fb4\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714288 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1a7724-ed52-4481-bbaa-1a0464561b2a-operator-scripts\") pod \"ef1a7724-ed52-4481-bbaa-1a0464561b2a\" (UID: \"ef1a7724-ed52-4481-bbaa-1a0464561b2a\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714316 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-public-tls-certs\") pod \"82c06532-183f-4527-b423-630110596fb4\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714389 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62921d0c-9482-49a2-8d20-5dda61ba80da-logs\") pod \"62921d0c-9482-49a2-8d20-5dda61ba80da\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714404 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data\") pod \"62921d0c-9482-49a2-8d20-5dda61ba80da\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714427 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78jp7\" (UniqueName: \"kubernetes.io/projected/ef1a7724-ed52-4481-bbaa-1a0464561b2a-kube-api-access-78jp7\") pod 
\"ef1a7724-ed52-4481-bbaa-1a0464561b2a\" (UID: \"ef1a7724-ed52-4481-bbaa-1a0464561b2a\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714451 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data-custom\") pod \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714470 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-combined-ca-bundle\") pod \"62921d0c-9482-49a2-8d20-5dda61ba80da\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714505 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-internal-tls-certs\") pod \"82c06532-183f-4527-b423-630110596fb4\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714520 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-config-data\") pod \"82c06532-183f-4527-b423-630110596fb4\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714552 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-log-httpd\") pod \"82c06532-183f-4527-b423-630110596fb4\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714569 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-combined-ca-bundle\") pod \"82c06532-183f-4527-b423-630110596fb4\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714594 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-combined-ca-bundle\") pod \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714610 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkq48\" (UniqueName: \"kubernetes.io/projected/31a4dbbc-e575-4d30-bba1-f3785c1e497e-kube-api-access-lkq48\") pod \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\" (UID: \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714640 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4dbbc-e575-4d30-bba1-f3785c1e497e-operator-scripts\") pod \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\" (UID: \"31a4dbbc-e575-4d30-bba1-f3785c1e497e\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714659 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2s4w\" (UniqueName: 
\"kubernetes.io/projected/558ab235-719b-4913-aab2-863bfa6586e8-kube-api-access-v2s4w\") pod \"558ab235-719b-4913-aab2-863bfa6586e8\" (UID: \"558ab235-719b-4913-aab2-863bfa6586e8\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714673 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7bgp\" (UniqueName: \"kubernetes.io/projected/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-kube-api-access-k7bgp\") pod \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\" (UID: \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714689 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-operator-scripts\") pod \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\" (UID: \"dfce025b-40c8-4ae3-b1c2-a1d858e11adb\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714717 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5766d009-a05f-4e8e-8267-9bd6c1267d3a-logs\") pod \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714762 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zphs5\" (UniqueName: \"kubernetes.io/projected/5766d009-a05f-4e8e-8267-9bd6c1267d3a-kube-api-access-zphs5\") pod \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714784 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prv7t\" (UniqueName: \"kubernetes.io/projected/62921d0c-9482-49a2-8d20-5dda61ba80da-kube-api-access-prv7t\") pod \"62921d0c-9482-49a2-8d20-5dda61ba80da\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714802 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data\") pod \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\" (UID: \"5766d009-a05f-4e8e-8267-9bd6c1267d3a\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714833 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts\") pod \"558ab235-719b-4913-aab2-863bfa6586e8\" (UID: \"558ab235-719b-4913-aab2-863bfa6586e8\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714883 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-etc-swift\") pod \"82c06532-183f-4527-b423-630110596fb4\" (UID: \"82c06532-183f-4527-b423-630110596fb4\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714922 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data-custom\") pod \"62921d0c-9482-49a2-8d20-5dda61ba80da\" (UID: \"62921d0c-9482-49a2-8d20-5dda61ba80da\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.714921 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "82c06532-183f-4527-b423-630110596fb4" (UID: "82c06532-183f-4527-b423-630110596fb4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.715444 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.715463 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.715597 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dfce025b-40c8-4ae3-b1c2-a1d858e11adb" (UID: "dfce025b-40c8-4ae3-b1c2-a1d858e11adb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.719472 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5766d009-a05f-4e8e-8267-9bd6c1267d3a-logs" (OuterVolumeSpecName: "logs") pod "5766d009-a05f-4e8e-8267-9bd6c1267d3a" (UID: "5766d009-a05f-4e8e-8267-9bd6c1267d3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.720528 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "558ab235-719b-4913-aab2-863bfa6586e8" (UID: "558ab235-719b-4913-aab2-863bfa6586e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.722200 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "82c06532-183f-4527-b423-630110596fb4" (UID: "82c06532-183f-4527-b423-630110596fb4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.722421 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1a7724-ed52-4481-bbaa-1a0464561b2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef1a7724-ed52-4481-bbaa-1a0464561b2a" (UID: "ef1a7724-ed52-4481-bbaa-1a0464561b2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.722811 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31a4dbbc-e575-4d30-bba1-f3785c1e497e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31a4dbbc-e575-4d30-bba1-f3785c1e497e" (UID: "31a4dbbc-e575-4d30-bba1-f3785c1e497e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.724494 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62921d0c-9482-49a2-8d20-5dda61ba80da-logs" (OuterVolumeSpecName: "logs") pod "62921d0c-9482-49a2-8d20-5dda61ba80da" (UID: "62921d0c-9482-49a2-8d20-5dda61ba80da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.725825 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1a7724-ed52-4481-bbaa-1a0464561b2a-kube-api-access-78jp7" (OuterVolumeSpecName: "kube-api-access-78jp7") pod "ef1a7724-ed52-4481-bbaa-1a0464561b2a" (UID: "ef1a7724-ed52-4481-bbaa-1a0464561b2a"). InnerVolumeSpecName "kube-api-access-78jp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.727859 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62921d0c-9482-49a2-8d20-5dda61ba80da-kube-api-access-prv7t" (OuterVolumeSpecName: "kube-api-access-prv7t") pod "62921d0c-9482-49a2-8d20-5dda61ba80da" (UID: "62921d0c-9482-49a2-8d20-5dda61ba80da"). InnerVolumeSpecName "kube-api-access-prv7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.730829 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7896b659cf-r8vcb" event={"ID":"82c06532-183f-4527-b423-630110596fb4","Type":"ContainerDied","Data":"87d0d9f0e044c738cb947de5af9e24b5365d10b11b844923864aa48497f20936"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.730889 4753 scope.go:117] "RemoveContainer" containerID="c03517e0292b57c6b359b996af8065e26ca76f92b447521f128204efb05538e3" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.731063 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7896b659cf-r8vcb" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.732505 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "82c06532-183f-4527-b423-630110596fb4" (UID: "82c06532-183f-4527-b423-630110596fb4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.735246 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5766d009-a05f-4e8e-8267-9bd6c1267d3a" (UID: "5766d009-a05f-4e8e-8267-9bd6c1267d3a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.738265 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-kube-api-access-k7bgp" (OuterVolumeSpecName: "kube-api-access-k7bgp") pod "dfce025b-40c8-4ae3-b1c2-a1d858e11adb" (UID: "dfce025b-40c8-4ae3-b1c2-a1d858e11adb"). InnerVolumeSpecName "kube-api-access-k7bgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.739692 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558ab235-719b-4913-aab2-863bfa6586e8-kube-api-access-v2s4w" (OuterVolumeSpecName: "kube-api-access-v2s4w") pod "558ab235-719b-4913-aab2-863bfa6586e8" (UID: "558ab235-719b-4913-aab2-863bfa6586e8"). InnerVolumeSpecName "kube-api-access-v2s4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.740797 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62921d0c-9482-49a2-8d20-5dda61ba80da" (UID: "62921d0c-9482-49a2-8d20-5dda61ba80da"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.741061 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-kube-api-access-rmrjc" (OuterVolumeSpecName: "kube-api-access-rmrjc") pod "82c06532-183f-4527-b423-630110596fb4" (UID: "82c06532-183f-4527-b423-630110596fb4"). InnerVolumeSpecName "kube-api-access-rmrjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.742857 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-870b-account-create-update-kndlh" event={"ID":"dfce025b-40c8-4ae3-b1c2-a1d858e11adb","Type":"ContainerDied","Data":"060753a9957019838dad4cb7b926bdf9d210e7f430bdbda558a55e00bd88b43e"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.743052 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-870b-account-create-update-kndlh" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.748528 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a4dbbc-e575-4d30-bba1-f3785c1e497e-kube-api-access-lkq48" (OuterVolumeSpecName: "kube-api-access-lkq48") pod "31a4dbbc-e575-4d30-bba1-f3785c1e497e" (UID: "31a4dbbc-e575-4d30-bba1-f3785c1e497e"). InnerVolumeSpecName "kube-api-access-lkq48". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.748694 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.749922 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="05315649-b501-4aae-9c14-4e632b89be53" containerName="kube-state-metrics" containerID="cri-o://27f10dc0be419156c4795db55c1bed92ee49c68890c1310f78d7e5642ab655c9" gracePeriod=30 Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.755758 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"001ea12a-a725-4cd9-a12e-1442d56f7068","Type":"ContainerDied","Data":"b73bc2dcdd1e10d160fcb2dd67ce0c404d53f92df5fb95335e5b3080b1552073"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.755930 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.762515 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" event={"ID":"62921d0c-9482-49a2-8d20-5dda61ba80da","Type":"ContainerDied","Data":"0fc06144907ebce1c52c6c2f39e97bc3d8ae95784a20a740f4f2e06aca457a92"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.762641 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-599fdd45b6-c7l8c" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.767134 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5766d009-a05f-4e8e-8267-9bd6c1267d3a-kube-api-access-zphs5" (OuterVolumeSpecName: "kube-api-access-zphs5") pod "5766d009-a05f-4e8e-8267-9bd6c1267d3a" (UID: "5766d009-a05f-4e8e-8267-9bd6c1267d3a"). InnerVolumeSpecName "kube-api-access-zphs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.778176 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7f8-account-create-update-2zm4w" event={"ID":"0dc24c61-81b8-4f51-a2c7-961548f1b11b","Type":"ContainerDied","Data":"00d8949f8f371a4a63403fa15f6ce63e0989553fa96664b0c3aa3ab7af2bc8d6"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.778298 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b7f8-account-create-update-2zm4w" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.786784 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.787494 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f2dbc378-044c-49a2-a891-94a90a0acff1","Type":"ContainerDied","Data":"6b786f25bafe8d6c3eaf139cfc995408a12a5b9be476cf604c495d5f9896ac60"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.802476 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" event={"ID":"5766d009-a05f-4e8e-8267-9bd6c1267d3a","Type":"ContainerDied","Data":"f602b4bb61fe764b47c81e745fe06ca697de6376b87af80355b0767cc16d33ad"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.802586 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d9c4fb469-wlxbk" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.807321 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c286z" event={"ID":"558ab235-719b-4913-aab2-863bfa6586e8","Type":"ContainerDied","Data":"41d74e4feaeb0b2190fc2c5ce8b2fc54c7da6fbcce00f05a893bd28ad38d8332"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.807429 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c286z" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.811124 4753 generic.go:334] "Generic (PLEG): container finished" podID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerID="0a800f0f0fcb4a6f6136a9675b82b2ab62e4096da3d7cc45f7830d2af553041f" exitCode=0 Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.811235 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.811260 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7947b8f0-b134-40d9-beba-116bbb51a1c2","Type":"ContainerDied","Data":"0a800f0f0fcb4a6f6136a9675b82b2ab62e4096da3d7cc45f7830d2af553041f"} Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.818864 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc24c61-81b8-4f51-a2c7-961548f1b11b-operator-scripts\") pod \"0dc24c61-81b8-4f51-a2c7-961548f1b11b\" (UID: \"0dc24c61-81b8-4f51-a2c7-961548f1b11b\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.819003 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh67z\" (UniqueName: \"kubernetes.io/projected/0dc24c61-81b8-4f51-a2c7-961548f1b11b-kube-api-access-mh67z\") pod \"0dc24c61-81b8-4f51-a2c7-961548f1b11b\" (UID: \"0dc24c61-81b8-4f51-a2c7-961548f1b11b\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.821212 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dc24c61-81b8-4f51-a2c7-961548f1b11b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0dc24c61-81b8-4f51-a2c7-961548f1b11b" (UID: "0dc24c61-81b8-4f51-a2c7-961548f1b11b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.823988 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62921d0c-9482-49a2-8d20-5dda61ba80da" (UID: "62921d0c-9482-49a2-8d20-5dda61ba80da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825360 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825490 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmrjc\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-kube-api-access-rmrjc\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825512 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1a7724-ed52-4481-bbaa-1a0464561b2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825526 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62921d0c-9482-49a2-8d20-5dda61ba80da-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825540 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78jp7\" (UniqueName: \"kubernetes.io/projected/ef1a7724-ed52-4481-bbaa-1a0464561b2a-kube-api-access-78jp7\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825552 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825563 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82c06532-183f-4527-b423-630110596fb4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825574 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkq48\" (UniqueName: \"kubernetes.io/projected/31a4dbbc-e575-4d30-bba1-f3785c1e497e-kube-api-access-lkq48\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825587 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a4dbbc-e575-4d30-bba1-f3785c1e497e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825599 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2s4w\" (UniqueName: \"kubernetes.io/projected/558ab235-719b-4913-aab2-863bfa6586e8-kube-api-access-v2s4w\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825611 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7bgp\" (UniqueName: \"kubernetes.io/projected/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-kube-api-access-k7bgp\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825622 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfce025b-40c8-4ae3-b1c2-a1d858e11adb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825632 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5766d009-a05f-4e8e-8267-9bd6c1267d3a-logs\") on node 
\"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825645 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zphs5\" (UniqueName: \"kubernetes.io/projected/5766d009-a05f-4e8e-8267-9bd6c1267d3a-kube-api-access-zphs5\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825656 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prv7t\" (UniqueName: \"kubernetes.io/projected/62921d0c-9482-49a2-8d20-5dda61ba80da-kube-api-access-prv7t\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825666 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/558ab235-719b-4913-aab2-863bfa6586e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.825678 4753 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/82c06532-183f-4527-b423-630110596fb4-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.844763 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc24c61-81b8-4f51-a2c7-961548f1b11b-kube-api-access-mh67z" (OuterVolumeSpecName: "kube-api-access-mh67z") pod "0dc24c61-81b8-4f51-a2c7-961548f1b11b" (UID: "0dc24c61-81b8-4f51-a2c7-961548f1b11b"). InnerVolumeSpecName "kube-api-access-mh67z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: E0129 14:24:38.847892 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.848010 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5766d009-a05f-4e8e-8267-9bd6c1267d3a" (UID: "5766d009-a05f-4e8e-8267-9bd6c1267d3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: E0129 14:24:38.864221 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.867534 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data" (OuterVolumeSpecName: "config-data") pod "5766d009-a05f-4e8e-8267-9bd6c1267d3a" (UID: "5766d009-a05f-4e8e-8267-9bd6c1267d3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: E0129 14:24:38.868461 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 14:24:38 crc kubenswrapper[4753]: E0129 14:24:38.868513 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="15b88ba9-8449-4e76-a36c-34ca2b2be488" containerName="nova-scheduler-scheduler" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.868828 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.875474 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data" (OuterVolumeSpecName: "config-data") pod "62921d0c-9482-49a2-8d20-5dda61ba80da" (UID: "62921d0c-9482-49a2-8d20-5dda61ba80da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.876228 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "82c06532-183f-4527-b423-630110596fb4" (UID: "82c06532-183f-4527-b423-630110596fb4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.882828 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82c06532-183f-4527-b423-630110596fb4" (UID: "82c06532-183f-4527-b423-630110596fb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.935061 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g62rn\" (UniqueName: \"kubernetes.io/projected/acfc5598-76b4-4673-aeef-b9105a8e6853-kube-api-access-g62rn\") pod \"acfc5598-76b4-4673-aeef-b9105a8e6853\" (UID: \"acfc5598-76b4-4673-aeef-b9105a8e6853\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.935374 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfc5598-76b4-4673-aeef-b9105a8e6853-operator-scripts\") pod \"acfc5598-76b4-4673-aeef-b9105a8e6853\" (UID: \"acfc5598-76b4-4673-aeef-b9105a8e6853\") " Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.937039 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acfc5598-76b4-4673-aeef-b9105a8e6853-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "acfc5598-76b4-4673-aeef-b9105a8e6853" (UID: "acfc5598-76b4-4673-aeef-b9105a8e6853"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.938786 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfc5598-76b4-4673-aeef-b9105a8e6853-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.938812 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.938825 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62921d0c-9482-49a2-8d20-5dda61ba80da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.938836 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc24c61-81b8-4f51-a2c7-961548f1b11b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.938846 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.938854 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.938865 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.938877 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh67z\" (UniqueName: \"kubernetes.io/projected/0dc24c61-81b8-4f51-a2c7-961548f1b11b-kube-api-access-mh67z\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.938889 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5766d009-a05f-4e8e-8267-9bd6c1267d3a-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.948492 4753 scope.go:117] "RemoveContainer" containerID="94767674e99c2eb5b0abc2208da66b5b5d0e5bbb98eae7fc0cc7902d80c6ed4b" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.951825 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfc5598-76b4-4673-aeef-b9105a8e6853-kube-api-access-g62rn" (OuterVolumeSpecName: "kube-api-access-g62rn") pod "acfc5598-76b4-4673-aeef-b9105a8e6853" (UID: "acfc5598-76b4-4673-aeef-b9105a8e6853"). InnerVolumeSpecName "kube-api-access-g62rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.952406 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-config-data" (OuterVolumeSpecName: "config-data") pod "82c06532-183f-4527-b423-630110596fb4" (UID: "82c06532-183f-4527-b423-630110596fb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:38 crc kubenswrapper[4753]: I0129 14:24:38.952606 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.045584 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-public-tls-certs\") pod \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.058170 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-combined-ca-bundle\") pod \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.058416 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-logs\") pod \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.058569 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-scripts\") pod \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.058639 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th988\" (UniqueName: \"kubernetes.io/projected/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-kube-api-access-th988\") pod \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.058705 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-internal-tls-certs\") pod \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.058837 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-config-data\") pod \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\" (UID: \"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.059469 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.059533 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g62rn\" (UniqueName: \"kubernetes.io/projected/acfc5598-76b4-4673-aeef-b9105a8e6853-kube-api-access-g62rn\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.074406 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-logs" (OuterVolumeSpecName: "logs") pod "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" (UID: 
"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.088445 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="05315649-b501-4aae-9c14-4e632b89be53" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.196:8081/readyz\": dial tcp 10.217.0.196:8081: connect: connection refused" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.111334 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "82c06532-183f-4527-b423-630110596fb4" (UID: "82c06532-183f-4527-b423-630110596fb4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.111550 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-kube-api-access-th988" (OuterVolumeSpecName: "kube-api-access-th988") pod "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" (UID: "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7"). InnerVolumeSpecName "kube-api-access-th988". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.114565 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-scripts" (OuterVolumeSpecName: "scripts") pod "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" (UID: "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.162577 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.162608 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82c06532-183f-4527-b423-630110596fb4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.162616 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.162624 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th988\" (UniqueName: \"kubernetes.io/projected/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-kube-api-access-th988\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.246547 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.246809 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" containerName="memcached" containerID="cri-o://5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48" gracePeriod=30 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.275875 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2e4b-account-create-update-sbtfx"] Jan 29 14:24:39 crc 
kubenswrapper[4753]: I0129 14:24:39.311265 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2e4b-account-create-update-sbtfx"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.328066 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2e4b-account-create-update-t2l6n"] Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.328528 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-httpd" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.328541 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-httpd" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.328563 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerName="barbican-keystone-listener" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.328570 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerName="barbican-keystone-listener" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.328580 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerName="placement-log" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.328586 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerName="placement-log" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.328593 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001ea12a-a725-4cd9-a12e-1442d56f7068" containerName="galera" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.328599 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="001ea12a-a725-4cd9-a12e-1442d56f7068" containerName="galera" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.328611 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" containerName="barbican-worker" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.335754 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" containerName="barbican-worker" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.335906 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001ea12a-a725-4cd9-a12e-1442d56f7068" containerName="mysql-bootstrap" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.335941 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="001ea12a-a725-4cd9-a12e-1442d56f7068" containerName="mysql-bootstrap" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.336316 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-server" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336324 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-server" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.336340 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" containerName="barbican-worker-log" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336346 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" containerName="barbican-worker-log" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 
14:24:39.336354 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerName="barbican-keystone-listener-log" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336360 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerName="barbican-keystone-listener-log" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.336370 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerName="placement-api" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336376 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerName="placement-api" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.336385 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerName="cinder-api-log" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336391 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerName="cinder-api-log" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.336399 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerName="cinder-api" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336404 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerName="cinder-api" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336637 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerName="barbican-keystone-listener" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336647 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerName="cinder-api-log" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336656 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerName="placement-log" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336664 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" containerName="cinder-api" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336675 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" containerName="barbican-worker-log" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336686 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" containerName="placement-api" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336699 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="001ea12a-a725-4cd9-a12e-1442d56f7068" containerName="galera" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336710 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-httpd" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336722 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c06532-183f-4527-b423-630110596fb4" containerName="proxy-server" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336731 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" 
containerName="barbican-worker" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.336739 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="62921d0c-9482-49a2-8d20-5dda61ba80da" containerName="barbican-keystone-listener-log" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.344820 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.359519 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.366611 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-q5h9z"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.393446 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-q5h9z"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.406638 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2e4b-account-create-update-t2l6n"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.420894 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" (UID: "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.434644 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-n9g6j"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.434695 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-n9g6j"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.437924 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7b9f57fc94-gqqlc"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.438115 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7b9f57fc94-gqqlc" podUID="cf6045aa-89c7-46c0-ba1e-4d63b9740883" containerName="keystone-api" containerID="cri-o://e450be8a7013dbd08831b7150e1c8ccd73e085bf16b828b286d89403e5f7cfbc" gracePeriod=30 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.446371 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-config-data" (OuterVolumeSpecName: "config-data") pod "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" (UID: "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.447813 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.467340 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mfqkt"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.472035 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" (UID: "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.475496 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mfqkt"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.481930 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6kwp\" (UniqueName: \"kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp\") pod \"keystone-2e4b-account-create-update-t2l6n\" (UID: \"eae1aad4-9fe6-4aa7-9071-170560f783af\") " pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.482059 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts\") pod \"keystone-2e4b-account-create-update-t2l6n\" (UID: \"eae1aad4-9fe6-4aa7-9071-170560f783af\") " pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.482185 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.482204 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.482216 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.483521 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.488519 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.488667 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2e4b-account-create-update-t2l6n"] Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.501227 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.501304 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-cell0-conductor-0" podUID="fb7a325b-7833-484f-8bba-7dc85ebf57cd" containerName="nova-cell0-conductor-conductor" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.512615 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kt2lt"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.526212 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" (UID: "a641f31f-1bb2-4a49-8e74-3d5baf14bfe7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.558438 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c286z"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.562505 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.564741 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-c286z"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.567902 4753 scope.go:117] "RemoveContainer" containerID="b638be6aff7479c4c3b3e3e264266ee7a0c8949f1731326d7adf23a76a43271b" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.577011 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.584372 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6kwp\" (UniqueName: \"kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp\") pod \"keystone-2e4b-account-create-update-t2l6n\" (UID: \"eae1aad4-9fe6-4aa7-9071-170560f783af\") " pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.584498 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts\") pod \"keystone-2e4b-account-create-update-t2l6n\" (UID: \"eae1aad4-9fe6-4aa7-9071-170560f783af\") " pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.584578 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.584658 4753 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.584713 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts podName:eae1aad4-9fe6-4aa7-9071-170560f783af nodeName:}" failed. No retries permitted until 2026-01-29 14:24:40.08469606 +0000 UTC m=+1314.779430442 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts") pod "keystone-2e4b-account-create-update-t2l6n" (UID: "eae1aad4-9fe6-4aa7-9071-170560f783af") : configmap "openstack-scripts" not found Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.589553 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.606773 4753 projected.go:194] Error preparing data for projected volume kube-api-access-f6kwp for pod openstack/keystone-2e4b-account-create-update-t2l6n: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.606853 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp podName:eae1aad4-9fe6-4aa7-9071-170560f783af nodeName:}" failed. No retries permitted until 2026-01-29 14:24:40.10682683 +0000 UTC m=+1314.801561212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-f6kwp" (UniqueName: "kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp") pod "keystone-2e4b-account-create-update-t2l6n" (UID: "eae1aad4-9fe6-4aa7-9071-170560f783af") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 29 14:24:39 crc kubenswrapper[4753]: E0129 14:24:39.632399 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-f6kwp operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-2e4b-account-create-update-t2l6n" podUID="eae1aad4-9fe6-4aa7-9071-170560f783af" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.669123 4753 scope.go:117] "RemoveContainer" containerID="c48f2caaa7b2b2dcc7a3ca761e1a8901c8e8087e528a5186cf1bb1488756685e" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.669602 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-870b-account-create-update-kndlh"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.677907 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-870b-account-create-update-kndlh"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.685339 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-config-data\") pod \"7947b8f0-b134-40d9-beba-116bbb51a1c2\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.685441 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7947b8f0-b134-40d9-beba-116bbb51a1c2-logs\") pod \"7947b8f0-b134-40d9-beba-116bbb51a1c2\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.685589 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-nova-metadata-tls-certs\") pod \"7947b8f0-b134-40d9-beba-116bbb51a1c2\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.685671 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-combined-ca-bundle\") pod \"7947b8f0-b134-40d9-beba-116bbb51a1c2\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.685714 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txzts\" (UniqueName: \"kubernetes.io/projected/7947b8f0-b134-40d9-beba-116bbb51a1c2-kube-api-access-txzts\") pod \"7947b8f0-b134-40d9-beba-116bbb51a1c2\" (UID: \"7947b8f0-b134-40d9-beba-116bbb51a1c2\") " Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.687723 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7947b8f0-b134-40d9-beba-116bbb51a1c2-logs" (OuterVolumeSpecName: "logs") pod "7947b8f0-b134-40d9-beba-116bbb51a1c2" (UID: "7947b8f0-b134-40d9-beba-116bbb51a1c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.690694 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.698352 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7947b8f0-b134-40d9-beba-116bbb51a1c2-kube-api-access-txzts" (OuterVolumeSpecName: "kube-api-access-txzts") pod "7947b8f0-b134-40d9-beba-116bbb51a1c2" (UID: "7947b8f0-b134-40d9-beba-116bbb51a1c2"). InnerVolumeSpecName "kube-api-access-txzts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.712380 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.726753 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7896b659cf-r8vcb"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.727030 4753 scope.go:117] "RemoveContainer" containerID="4aa6335e18e43896614c27e4605f0813d42217c00887d65f886c827434a9fc80" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.737999 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7896b659cf-r8vcb"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.753766 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c053-account-create-update-x8gmz"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.754958 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-config-data" (OuterVolumeSpecName: "config-data") pod "7947b8f0-b134-40d9-beba-116bbb51a1c2" (UID: "7947b8f0-b134-40d9-beba-116bbb51a1c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.760389 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c053-account-create-update-x8gmz"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.777910 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f8b4-account-create-update-8xzhc"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.782474 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f8b4-account-create-update-8xzhc"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.786602 4753 scope.go:117] "RemoveContainer" containerID="ad233db4e8b1fd3e1f6f172ac5f80cdfd53b96374113b3ddbb73831c7e9cb5c9" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.793945 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="94157b6b-3cc9-44e9-9625-64d34611046a" containerName="galera" containerID="cri-o://85e717f2d1168cff52e5656e97f7028853eb41a9dedbe7b1a8d1cda97bf06e35" gracePeriod=30 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.800637 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txzts\" (UniqueName: \"kubernetes.io/projected/7947b8f0-b134-40d9-beba-116bbb51a1c2-kube-api-access-txzts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.800673 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.800684 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7947b8f0-b134-40d9-beba-116bbb51a1c2-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.804346 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7947b8f0-b134-40d9-beba-116bbb51a1c2" (UID: "7947b8f0-b134-40d9-beba-116bbb51a1c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.818525 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7947b8f0-b134-40d9-beba-116bbb51a1c2" (UID: "7947b8f0-b134-40d9-beba-116bbb51a1c2"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.854787 4753 generic.go:334] "Generic (PLEG): container finished" podID="13672aee-1e34-4763-88d7-35ac9b484c87" containerID="49145b48bee89f5fd944b3ebdb12f7d989505bd4659c96c697f65f65d2481518" exitCode=0 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.855093 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"13672aee-1e34-4763-88d7-35ac9b484c87","Type":"ContainerDied","Data":"49145b48bee89f5fd944b3ebdb12f7d989505bd4659c96c697f65f65d2481518"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.865170 4753 generic.go:334] "Generic (PLEG): container finished" podID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerID="0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc" exitCode=0 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.865197 4753 generic.go:334] "Generic (PLEG): container finished" podID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerID="8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4" exitCode=2 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.865203 4753 generic.go:334] "Generic (PLEG): container finished" podID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerID="77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd" exitCode=0 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.865244 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerDied","Data":"0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.865315 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerDied","Data":"8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.865327 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerDied","Data":"77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.883132 4753 generic.go:334] "Generic (PLEG): container finished" podID="05315649-b501-4aae-9c14-4e632b89be53" containerID="27f10dc0be419156c4795db55c1bed92ee49c68890c1310f78d7e5642ab655c9" exitCode=2 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.883248 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"05315649-b501-4aae-9c14-4e632b89be53","Type":"ContainerDied","Data":"27f10dc0be419156c4795db55c1bed92ee49c68890c1310f78d7e5642ab655c9"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.883789 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.884313 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fad8-account-create-update-lzht5" event={"ID":"acfc5598-76b4-4673-aeef-b9105a8e6853","Type":"ContainerDied","Data":"d1b5d984490842cc087fa4858e3e2c4c430bd18896c94ed0d1afbaf2a74d0a2f"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.884411 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fad8-account-create-update-lzht5" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.893371 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b7f8-account-create-update-2zm4w"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.903444 4753 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.903476 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7947b8f0-b134-40d9-beba-116bbb51a1c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.905346 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b7f8-account-create-update-2zm4w"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.908934 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78cbccfdbb-x7lwh" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.909332 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78cbccfdbb-x7lwh" event={"ID":"a641f31f-1bb2-4a49-8e74-3d5baf14bfe7","Type":"ContainerDied","Data":"60a7001a94c2226bc46eb06d5822549a8f0c4196428f5dc7b1b8b967f00daefe"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.912589 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7947b8f0-b134-40d9-beba-116bbb51a1c2","Type":"ContainerDied","Data":"c469b20772701e96ce71bae29e31c81555dc717d1d9369f4c7fa004bbaac807d"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.912667 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.916417 4753 scope.go:117] "RemoveContainer" containerID="a111f231b7967402acc681815c59ca3c8b5a6d1e5677b5d94e8de77f77841cbc" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.917583 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.919559 4753 generic.go:334] "Generic (PLEG): container finished" podID="6810e266-dec6-4731-884b-067f214781c2" containerID="e2d1fb3d7fa36ee6d4949b785910ce4cfa547832c2acade41dcbf22dc90c2c6e" exitCode=0 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.919638 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6810e266-dec6-4731-884b-067f214781c2","Type":"ContainerDied","Data":"e2d1fb3d7fa36ee6d4949b785910ce4cfa547832c2acade41dcbf22dc90c2c6e"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.921579 4753 generic.go:334] "Generic (PLEG): container finished" podID="3184c640-0157-4211-aa5a-aada8557e9f8" containerID="bf98498966b9708676b313afca0a0b4bb674752fe39d67e44f9f70b35df870b7" exitCode=0 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.921627 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3184c640-0157-4211-aa5a-aada8557e9f8","Type":"ContainerDied","Data":"bf98498966b9708676b313afca0a0b4bb674752fe39d67e44f9f70b35df870b7"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.923968 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.925246 4753 generic.go:334] "Generic (PLEG): container finished" podID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerID="831d4b37a8f34dc6da88e5991350950aff013d5011940ad0f9ecbf93b46818a1" exitCode=0 Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.925305 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.925591 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-754c57f55b-2hkbd" event={"ID":"920278e2-a31f-4ad2-81be-d30a799b9d64","Type":"ContainerDied","Data":"831d4b37a8f34dc6da88e5991350950aff013d5011940ad0f9ecbf93b46818a1"} Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.944096 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-599fdd45b6-c7l8c"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.948715 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-599fdd45b6-c7l8c"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.953474 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d9c4fb469-wlxbk"] Jan 29 14:24:39 crc kubenswrapper[4753]: I0129 14:24:39.959690 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6d9c4fb469-wlxbk"] Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.005711 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-config\") pod \"05315649-b501-4aae-9c14-4e632b89be53\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.006262 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8w7g\" (UniqueName: \"kubernetes.io/projected/05315649-b501-4aae-9c14-4e632b89be53-kube-api-access-h8w7g\") pod \"05315649-b501-4aae-9c14-4e632b89be53\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.006337 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-combined-ca-bundle\") pod \"05315649-b501-4aae-9c14-4e632b89be53\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.006424 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-certs\") pod \"05315649-b501-4aae-9c14-4e632b89be53\" (UID: \"05315649-b501-4aae-9c14-4e632b89be53\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.016457 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05315649-b501-4aae-9c14-4e632b89be53-kube-api-access-h8w7g" (OuterVolumeSpecName: "kube-api-access-h8w7g") pod "05315649-b501-4aae-9c14-4e632b89be53" (UID: "05315649-b501-4aae-9c14-4e632b89be53"). InnerVolumeSpecName "kube-api-access-h8w7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.035168 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05315649-b501-4aae-9c14-4e632b89be53" (UID: "05315649-b501-4aae-9c14-4e632b89be53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.096440 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "05315649-b501-4aae-9c14-4e632b89be53" (UID: "05315649-b501-4aae-9c14-4e632b89be53"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.108673 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6kwp\" (UniqueName: \"kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp\") pod \"keystone-2e4b-account-create-update-t2l6n\" (UID: \"eae1aad4-9fe6-4aa7-9071-170560f783af\") " pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.108781 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts\") pod \"keystone-2e4b-account-create-update-t2l6n\" (UID: \"eae1aad4-9fe6-4aa7-9071-170560f783af\") " pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.108925 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.108939 4753 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.108953 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8w7g\" (UniqueName: \"kubernetes.io/projected/05315649-b501-4aae-9c14-4e632b89be53-kube-api-access-h8w7g\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.109020 4753 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.109075 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts podName:eae1aad4-9fe6-4aa7-9071-170560f783af nodeName:}" failed. No retries permitted until 2026-01-29 14:24:41.109055854 +0000 UTC m=+1315.803790236 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts") pod "keystone-2e4b-account-create-update-t2l6n" (UID: "eae1aad4-9fe6-4aa7-9071-170560f783af") : configmap "openstack-scripts" not found Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.116372 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "05315649-b501-4aae-9c14-4e632b89be53" (UID: "05315649-b501-4aae-9c14-4e632b89be53"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.120516 4753 projected.go:194] Error preparing data for projected volume kube-api-access-f6kwp for pod openstack/keystone-2e4b-account-create-update-t2l6n: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.120575 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp podName:eae1aad4-9fe6-4aa7-9071-170560f783af nodeName:}" failed. No retries permitted until 2026-01-29 14:24:41.12055849 +0000 UTC m=+1315.815292872 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-f6kwp" (UniqueName: "kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp") pod "keystone-2e4b-account-create-update-t2l6n" (UID: "eae1aad4-9fe6-4aa7-9071-170560f783af") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.125601 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kt2lt"] Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.172099 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001ea12a-a725-4cd9-a12e-1442d56f7068" path="/var/lib/kubelet/pods/001ea12a-a725-4cd9-a12e-1442d56f7068/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.179924 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc24c61-81b8-4f51-a2c7-961548f1b11b" path="/var/lib/kubelet/pods/0dc24c61-81b8-4f51-a2c7-961548f1b11b/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.180348 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b88724-5390-491e-a5b3-0b3fbcbf8bd2" path="/var/lib/kubelet/pods/30b88724-5390-491e-a5b3-0b3fbcbf8bd2/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.181053 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a4dbbc-e575-4d30-bba1-f3785c1e497e" path="/var/lib/kubelet/pods/31a4dbbc-e575-4d30-bba1-f3785c1e497e/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.181558 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558ab235-719b-4913-aab2-863bfa6586e8" path="/var/lib/kubelet/pods/558ab235-719b-4913-aab2-863bfa6586e8/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.182127 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5766d009-a05f-4e8e-8267-9bd6c1267d3a" path="/var/lib/kubelet/pods/5766d009-a05f-4e8e-8267-9bd6c1267d3a/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.182932 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62921d0c-9482-49a2-8d20-5dda61ba80da" path="/var/lib/kubelet/pods/62921d0c-9482-49a2-8d20-5dda61ba80da/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.197032 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aba56e7-db42-42d9-9586-fb6a145f2a39" path="/var/lib/kubelet/pods/6aba56e7-db42-42d9-9586-fb6a145f2a39/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.197799 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c06532-183f-4527-b423-630110596fb4" path="/var/lib/kubelet/pods/82c06532-183f-4527-b423-630110596fb4/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.198565 4753 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb" path="/var/lib/kubelet/pods/aaee2cf2-b9e9-4dce-95a7-33ebf90dc3eb/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.201472 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de34f6dc-67dd-4054-84a9-a051e0ba2876" path="/var/lib/kubelet/pods/de34f6dc-67dd-4054-84a9-a051e0ba2876/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.202297 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 14:24:40 crc kubenswrapper[4753]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 29 14:24:40 crc kubenswrapper[4753]: Jan 29 14:24:40 crc kubenswrapper[4753]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 29 14:24:40 crc kubenswrapper[4753]: Jan 29 14:24:40 crc kubenswrapper[4753]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 29 14:24:40 crc kubenswrapper[4753]: Jan 29 14:24:40 crc kubenswrapper[4753]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 29 14:24:40 crc kubenswrapper[4753]: Jan 29 14:24:40 crc kubenswrapper[4753]: if [ -n "" ]; then Jan 29 14:24:40 crc kubenswrapper[4753]: GRANT_DATABASE="" Jan 29 14:24:40 crc kubenswrapper[4753]: else Jan 29 14:24:40 crc kubenswrapper[4753]: GRANT_DATABASE="*" Jan 29 14:24:40 crc kubenswrapper[4753]: fi Jan 29 14:24:40 crc kubenswrapper[4753]: Jan 29 14:24:40 crc kubenswrapper[4753]: # going for maximum compatibility here: Jan 29 14:24:40 crc kubenswrapper[4753]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 29 14:24:40 crc kubenswrapper[4753]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 29 14:24:40 crc kubenswrapper[4753]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 29 14:24:40 crc kubenswrapper[4753]: # support updates Jan 29 14:24:40 crc kubenswrapper[4753]: Jan 29 14:24:40 crc kubenswrapper[4753]: $MYSQL_CMD < logger="UnhandledError" Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.204191 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-kt2lt" podUID="964a679c-ce73-46ec-8e88-37c972fbe817" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.215027 4753 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/05315649-b501-4aae-9c14-4e632b89be53-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.215420 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfce025b-40c8-4ae3-b1c2-a1d858e11adb" path="/var/lib/kubelet/pods/dfce025b-40c8-4ae3-b1c2-a1d858e11adb/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.215971 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed4d464b-59be-4a41-b9bc-2f16147c8ced" path="/var/lib/kubelet/pods/ed4d464b-59be-4a41-b9bc-2f16147c8ced/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.216675 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1a7724-ed52-4481-bbaa-1a0464561b2a" path="/var/lib/kubelet/pods/ef1a7724-ed52-4481-bbaa-1a0464561b2a/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.217200 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2dbc378-044c-49a2-a891-94a90a0acff1" path="/var/lib/kubelet/pods/f2dbc378-044c-49a2-a891-94a90a0acff1/volumes" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.287110 4753 scope.go:117] "RemoveContainer" containerID="656f30c915a464984557b8cece588ce1d9b95d296a1ac31530c3c6877585393c" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.424745 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.471847 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb7a325b_7833_484f_8bba_7dc85ebf57cd.slice/crio-conmon-cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf.scope\": RecentStats: unable to find data in memory cache]" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.526052 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-public-tls-certs\") pod \"13672aee-1e34-4763-88d7-35ac9b484c87\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.526332 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqshj\" (UniqueName: \"kubernetes.io/projected/13672aee-1e34-4763-88d7-35ac9b484c87-kube-api-access-hqshj\") pod \"13672aee-1e34-4763-88d7-35ac9b484c87\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.526382 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13672aee-1e34-4763-88d7-35ac9b484c87-logs\") pod \"13672aee-1e34-4763-88d7-35ac9b484c87\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.526403 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-config-data\") pod \"13672aee-1e34-4763-88d7-35ac9b484c87\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.526513 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-combined-ca-bundle\") pod \"13672aee-1e34-4763-88d7-35ac9b484c87\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.526552 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-internal-tls-certs\") pod \"13672aee-1e34-4763-88d7-35ac9b484c87\" (UID: \"13672aee-1e34-4763-88d7-35ac9b484c87\") " Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.527133 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.527545 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data podName:5f7e3e27-a036-4623-8d63-557a3c0d76e6 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:48.527525455 +0000 UTC m=+1323.222259837 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data") pod "rabbitmq-cell1-server-0" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6") : configmap "rabbitmq-cell1-config-data" not found Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.527839 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13672aee-1e34-4763-88d7-35ac9b484c87-logs" (OuterVolumeSpecName: "logs") pod "13672aee-1e34-4763-88d7-35ac9b484c87" (UID: "13672aee-1e34-4763-88d7-35ac9b484c87"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.532012 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13672aee-1e34-4763-88d7-35ac9b484c87-kube-api-access-hqshj" (OuterVolumeSpecName: "kube-api-access-hqshj") pod "13672aee-1e34-4763-88d7-35ac9b484c87" (UID: "13672aee-1e34-4763-88d7-35ac9b484c87"). InnerVolumeSpecName "kube-api-access-hqshj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.543032 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.553949 4753 scope.go:117] "RemoveContainer" containerID="4c6f60456d1d153f776a9722a5c5c012f82210a1ce9b96301d44ce657ac865a5" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.562935 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-config-data" (OuterVolumeSpecName: "config-data") pod "13672aee-1e34-4763-88d7-35ac9b484c87" (UID: "13672aee-1e34-4763-88d7-35ac9b484c87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.567114 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.569518 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.574083 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.575107 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13672aee-1e34-4763-88d7-35ac9b484c87" (UID: "13672aee-1e34-4763-88d7-35ac9b484c87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.587333 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.591134 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78cbccfdbb-x7lwh"] Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.591782 4753 scope.go:117] "RemoveContainer" containerID="125994f3ec1405b9e8dad0539053aa0a87df73ee58d6e17040fc7d0d422ceb19" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.593966 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-754c57f55b-2hkbd" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.604627 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.604718 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.608883 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-78cbccfdbb-x7lwh"] Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.622287 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-lzht5"] Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.624828 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "13672aee-1e34-4763-88d7-35ac9b484c87" (UID: "13672aee-1e34-4763-88d7-35ac9b484c87"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.626439 4753 scope.go:117] "RemoveContainer" containerID="7928bff86f6434736cea8c468f448a8ff6bae7916724dac5e9b712cc68b65281" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.630118 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.630173 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqshj\" (UniqueName: \"kubernetes.io/projected/13672aee-1e34-4763-88d7-35ac9b484c87-kube-api-access-hqshj\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.630182 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13672aee-1e34-4763-88d7-35ac9b484c87-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.630193 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.630223 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.636316 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-fad8-account-create-update-lzht5"] Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.674361 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "13672aee-1e34-4763-88d7-35ac9b484c87" (UID: "13672aee-1e34-4763-88d7-35ac9b484c87"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.676025 4753 scope.go:117] "RemoveContainer" containerID="0048536c095c1deea68c977175a1614cd4da94f70d3e465ff69e6a722a840709" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.708730 4753 scope.go:117] "RemoveContainer" containerID="0a800f0f0fcb4a6f6136a9675b82b2ab62e4096da3d7cc45f7830d2af553041f" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.730423 4753 scope.go:117] "RemoveContainer" containerID="f34f5bfa0d06f0211a2a796c39c41d96c4d203bd1909c7053e13afe8556789c1" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.730997 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-public-tls-certs\") pod \"3184c640-0157-4211-aa5a-aada8557e9f8\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731067 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb7pv\" (UniqueName: \"kubernetes.io/projected/920278e2-a31f-4ad2-81be-d30a799b9d64-kube-api-access-rb7pv\") pod \"920278e2-a31f-4ad2-81be-d30a799b9d64\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731096 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-logs\") pod \"6810e266-dec6-4731-884b-067f214781c2\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731122 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-logs\") pod \"3184c640-0157-4211-aa5a-aada8557e9f8\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731190 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-public-tls-certs\") pod \"920278e2-a31f-4ad2-81be-d30a799b9d64\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731204 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-internal-tls-certs\") pod \"6810e266-dec6-4731-884b-067f214781c2\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731232 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8vx6\" (UniqueName: \"kubernetes.io/projected/fb7a325b-7833-484f-8bba-7dc85ebf57cd-kube-api-access-b8vx6\") pod \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731261 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnnqf\" (UniqueName: \"kubernetes.io/projected/3184c640-0157-4211-aa5a-aada8557e9f8-kube-api-access-nnnqf\") pod \"3184c640-0157-4211-aa5a-aada8557e9f8\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731281 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data\") pod \"920278e2-a31f-4ad2-81be-d30a799b9d64\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731300 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-combined-ca-bundle\") pod \"6810e266-dec6-4731-884b-067f214781c2\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731319 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data-custom\") pod \"920278e2-a31f-4ad2-81be-d30a799b9d64\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731351 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-config-data\") pod \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731375 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-combined-ca-bundle\") pod \"920278e2-a31f-4ad2-81be-d30a799b9d64\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731395 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5xw9\" (UniqueName: \"kubernetes.io/projected/15b88ba9-8449-4e76-a36c-34ca2b2be488-kube-api-access-d5xw9\") pod \"15b88ba9-8449-4e76-a36c-34ca2b2be488\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731420 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6810e266-dec6-4731-884b-067f214781c2\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731450 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-scripts\") pod \"6810e266-dec6-4731-884b-067f214781c2\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731470 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-config-data\") pod \"15b88ba9-8449-4e76-a36c-34ca2b2be488\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731498 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-scripts\") pod \"3184c640-0157-4211-aa5a-aada8557e9f8\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731519 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/920278e2-a31f-4ad2-81be-d30a799b9d64-logs\") pod \"920278e2-a31f-4ad2-81be-d30a799b9d64\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731538 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-combined-ca-bundle\") pod \"3184c640-0157-4211-aa5a-aada8557e9f8\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731557 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-internal-tls-certs\") pod \"920278e2-a31f-4ad2-81be-d30a799b9d64\" (UID: \"920278e2-a31f-4ad2-81be-d30a799b9d64\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731579 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-combined-ca-bundle\") pod \"15b88ba9-8449-4e76-a36c-34ca2b2be488\" (UID: \"15b88ba9-8449-4e76-a36c-34ca2b2be488\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731643 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-httpd-run\") pod \"6810e266-dec6-4731-884b-067f214781c2\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731666 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qtp8\" (UniqueName: \"kubernetes.io/projected/6810e266-dec6-4731-884b-067f214781c2-kube-api-access-5qtp8\") pod \"6810e266-dec6-4731-884b-067f214781c2\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731691 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-combined-ca-bundle\") pod \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\" (UID: \"fb7a325b-7833-484f-8bba-7dc85ebf57cd\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731721 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-httpd-run\") pod \"3184c640-0157-4211-aa5a-aada8557e9f8\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731745 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-config-data\") pod \"6810e266-dec6-4731-884b-067f214781c2\" (UID: \"6810e266-dec6-4731-884b-067f214781c2\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731771 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"3184c640-0157-4211-aa5a-aada8557e9f8\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.731787 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-config-data\") pod \"3184c640-0157-4211-aa5a-aada8557e9f8\" (UID: \"3184c640-0157-4211-aa5a-aada8557e9f8\") " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.734906 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13672aee-1e34-4763-88d7-35ac9b484c87-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.745905 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-logs" (OuterVolumeSpecName: "logs") pod "3184c640-0157-4211-aa5a-aada8557e9f8" (UID: "3184c640-0157-4211-aa5a-aada8557e9f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.749401 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-logs" (OuterVolumeSpecName: "logs") pod "6810e266-dec6-4731-884b-067f214781c2" (UID: "6810e266-dec6-4731-884b-067f214781c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.750935 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6810e266-dec6-4731-884b-067f214781c2" (UID: "6810e266-dec6-4731-884b-067f214781c2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.756192 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "920278e2-a31f-4ad2-81be-d30a799b9d64" (UID: "920278e2-a31f-4ad2-81be-d30a799b9d64"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.756354 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3184c640-0157-4211-aa5a-aada8557e9f8-kube-api-access-nnnqf" (OuterVolumeSpecName: "kube-api-access-nnnqf") pod "3184c640-0157-4211-aa5a-aada8557e9f8" (UID: "3184c640-0157-4211-aa5a-aada8557e9f8"). InnerVolumeSpecName "kube-api-access-nnnqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.756717 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3184c640-0157-4211-aa5a-aada8557e9f8" (UID: "3184c640-0157-4211-aa5a-aada8557e9f8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.756852 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "6810e266-dec6-4731-884b-067f214781c2" (UID: "6810e266-dec6-4731-884b-067f214781c2"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.758626 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/920278e2-a31f-4ad2-81be-d30a799b9d64-logs" (OuterVolumeSpecName: "logs") pod "920278e2-a31f-4ad2-81be-d30a799b9d64" (UID: "920278e2-a31f-4ad2-81be-d30a799b9d64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.766916 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-scripts" (OuterVolumeSpecName: "scripts") pod "6810e266-dec6-4731-884b-067f214781c2" (UID: "6810e266-dec6-4731-884b-067f214781c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.772606 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7a325b-7833-484f-8bba-7dc85ebf57cd-kube-api-access-b8vx6" (OuterVolumeSpecName: "kube-api-access-b8vx6") pod "fb7a325b-7833-484f-8bba-7dc85ebf57cd" (UID: "fb7a325b-7833-484f-8bba-7dc85ebf57cd"). InnerVolumeSpecName "kube-api-access-b8vx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.773788 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6810e266-dec6-4731-884b-067f214781c2-kube-api-access-5qtp8" (OuterVolumeSpecName: "kube-api-access-5qtp8") pod "6810e266-dec6-4731-884b-067f214781c2" (UID: "6810e266-dec6-4731-884b-067f214781c2"). InnerVolumeSpecName "kube-api-access-5qtp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.775982 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/920278e2-a31f-4ad2-81be-d30a799b9d64-kube-api-access-rb7pv" (OuterVolumeSpecName: "kube-api-access-rb7pv") pod "920278e2-a31f-4ad2-81be-d30a799b9d64" (UID: "920278e2-a31f-4ad2-81be-d30a799b9d64"). InnerVolumeSpecName "kube-api-access-rb7pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.793484 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-scripts" (OuterVolumeSpecName: "scripts") pod "3184c640-0157-4211-aa5a-aada8557e9f8" (UID: "3184c640-0157-4211-aa5a-aada8557e9f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.793623 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "3184c640-0157-4211-aa5a-aada8557e9f8" (UID: "3184c640-0157-4211-aa5a-aada8557e9f8"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.793817 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b88ba9-8449-4e76-a36c-34ca2b2be488-kube-api-access-d5xw9" (OuterVolumeSpecName: "kube-api-access-d5xw9") pod "15b88ba9-8449-4e76-a36c-34ca2b2be488" (UID: "15b88ba9-8449-4e76-a36c-34ca2b2be488"). InnerVolumeSpecName "kube-api-access-d5xw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.810391 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6810e266-dec6-4731-884b-067f214781c2" (UID: "6810e266-dec6-4731-884b-067f214781c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.815305 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-config-data" (OuterVolumeSpecName: "config-data") pod "15b88ba9-8449-4e76-a36c-34ca2b2be488" (UID: "15b88ba9-8449-4e76-a36c-34ca2b2be488"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.819322 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "920278e2-a31f-4ad2-81be-d30a799b9d64" (UID: "920278e2-a31f-4ad2-81be-d30a799b9d64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.823651 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15b88ba9-8449-4e76-a36c-34ca2b2be488" (UID: "15b88ba9-8449-4e76-a36c-34ca2b2be488"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847621 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5xw9\" (UniqueName: \"kubernetes.io/projected/15b88ba9-8449-4e76-a36c-34ca2b2be488-kube-api-access-d5xw9\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847667 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847678 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847716 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847726 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847734 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/920278e2-a31f-4ad2-81be-d30a799b9d64-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847743 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/15b88ba9-8449-4e76-a36c-34ca2b2be488-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847751 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847760 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qtp8\" (UniqueName: \"kubernetes.io/projected/6810e266-dec6-4731-884b-067f214781c2-kube-api-access-5qtp8\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847789 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847805 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847816 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb7pv\" (UniqueName: \"kubernetes.io/projected/920278e2-a31f-4ad2-81be-d30a799b9d64-kube-api-access-rb7pv\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847825 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6810e266-dec6-4731-884b-067f214781c2-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847833 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3184c640-0157-4211-aa5a-aada8557e9f8-logs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847842 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8vx6\" (UniqueName: \"kubernetes.io/projected/fb7a325b-7833-484f-8bba-7dc85ebf57cd-kube-api-access-b8vx6\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847852 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnnqf\" (UniqueName: \"kubernetes.io/projected/3184c640-0157-4211-aa5a-aada8557e9f8-kube-api-access-nnnqf\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847860 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847869 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.847881 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.865240 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-config-data" 
(OuterVolumeSpecName: "config-data") pod "6810e266-dec6-4731-884b-067f214781c2" (UID: "6810e266-dec6-4731-884b-067f214781c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.867864 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.872187 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb7a325b-7833-484f-8bba-7dc85ebf57cd" (UID: "fb7a325b-7833-484f-8bba-7dc85ebf57cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.873481 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-config-data" (OuterVolumeSpecName: "config-data") pod "3184c640-0157-4211-aa5a-aada8557e9f8" (UID: "3184c640-0157-4211-aa5a-aada8557e9f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.893274 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1b5479e0d4430d6fc3745e3ed9afa5a4334d10df67d89187482d0455266b8f05" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.896233 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1b5479e0d4430d6fc3745e3ed9afa5a4334d10df67d89187482d0455266b8f05" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.897620 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1b5479e0d4430d6fc3745e3ed9afa5a4334d10df67d89187482d0455266b8f05" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 29 14:24:40 crc kubenswrapper[4753]: E0129 14:24:40.897653 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" containerName="nova-cell1-conductor-conductor" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.911777 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.950314 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-754c57f55b-2hkbd" event={"ID":"920278e2-a31f-4ad2-81be-d30a799b9d64","Type":"ContainerDied","Data":"d43baf5011f66db39e4f48ff74f5212be4b15a17d4fe4496eef46839b9b4fdd0"} Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.950376 4753 scope.go:117] "RemoveContainer" containerID="831d4b37a8f34dc6da88e5991350950aff013d5011940ad0f9ecbf93b46818a1" 
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.950511 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-754c57f55b-2hkbd"
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.957754 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.957786 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.957799 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.957810 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.962520 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kt2lt" event={"ID":"964a679c-ce73-46ec-8e88-37c972fbe817","Type":"ContainerStarted","Data":"d6cb9681ebd6ce332835e44baa8f3001b623d73fbbc9f5a58f6d7085523f0fe9"}
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.965837 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.965879 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6810e266-dec6-4731-884b-067f214781c2","Type":"ContainerDied","Data":"9a868df539376371754bcb8cc9c201732d3c2ee4f5c9581079b86ae7fa2e189c"}
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.974725 4753 generic.go:334] "Generic (PLEG): container finished" podID="fb7a325b-7833-484f-8bba-7dc85ebf57cd" containerID="cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf" exitCode=0
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.974790 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fb7a325b-7833-484f-8bba-7dc85ebf57cd","Type":"ContainerDied","Data":"cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf"}
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.974816 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fb7a325b-7833-484f-8bba-7dc85ebf57cd","Type":"ContainerDied","Data":"272d86541960e9b3fd28693f32258e7157d62b325454981c2f00d03254c22cc8"}
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.974870 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.979610 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"05315649-b501-4aae-9c14-4e632b89be53","Type":"ContainerDied","Data":"36e4c9d67763ebbdfa3c794e1f00f6b4e59a7abd195ad09b4271debed373fbce"}
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.979672 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.994059 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "920278e2-a31f-4ad2-81be-d30a799b9d64" (UID: "920278e2-a31f-4ad2-81be-d30a799b9d64"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:40 crc kubenswrapper[4753]: I0129 14:24:40.995916 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-config-data" (OuterVolumeSpecName: "config-data") pod "fb7a325b-7833-484f-8bba-7dc85ebf57cd" (UID: "fb7a325b-7833-484f-8bba-7dc85ebf57cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.000419 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.002864 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3184c640-0157-4211-aa5a-aada8557e9f8" (UID: "3184c640-0157-4211-aa5a-aada8557e9f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.003830 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "920278e2-a31f-4ad2-81be-d30a799b9d64" (UID: "920278e2-a31f-4ad2-81be-d30a799b9d64"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.008629 4753 generic.go:334] "Generic (PLEG): container finished" podID="ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" containerID="5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48" exitCode=0
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.008742 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb","Type":"ContainerDied","Data":"5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48"}
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.008778 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb","Type":"ContainerDied","Data":"82068b6aff78f1d96c9577ba7dcca553c78e7c01638829e64f5c13ad5a34ca4d"}
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.008852 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.014268 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"13672aee-1e34-4763-88d7-35ac9b484c87","Type":"ContainerDied","Data":"2b15e24310ae7db5d1643f62ac2bdf1b6aff69a043369d89e1e5e87030e70979"}
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.014405 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.024135 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_31019cc8-ce90-453f-be4f-949ed45a5873/ovn-northd/0.log"
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.024212 4753 generic.go:334] "Generic (PLEG): container finished" podID="31019cc8-ce90-453f-be4f-949ed45a5873" containerID="b19308d0814c3df635fdb38228ce9b7ebf5a99fefcc0274c1834d736932a59bd" exitCode=139
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.024291 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"31019cc8-ce90-453f-be4f-949ed45a5873","Type":"ContainerDied","Data":"b19308d0814c3df635fdb38228ce9b7ebf5a99fefcc0274c1834d736932a59bd"}
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.032342 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3184c640-0157-4211-aa5a-aada8557e9f8" (UID: "3184c640-0157-4211-aa5a-aada8557e9f8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.033900 4753 generic.go:334] "Generic (PLEG): container finished" podID="15b88ba9-8449-4e76-a36c-34ca2b2be488" containerID="24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539" exitCode=0
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.033959 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15b88ba9-8449-4e76-a36c-34ca2b2be488","Type":"ContainerDied","Data":"24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539"}
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.033982 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15b88ba9-8449-4e76-a36c-34ca2b2be488","Type":"ContainerDied","Data":"9a062ab3deb87a8eed7b4c1310901e0de2721c9113c51a5bae9d5b93a5fad4af"}
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.034035 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.051085 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2e4b-account-create-update-t2l6n"
Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.056594 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.061267 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3184c640-0157-4211-aa5a-aada8557e9f8","Type":"ContainerDied","Data":"d6674fe2218f2f77bdb97bc90e9d01638b395fd39f7b8d3fbd7084662bcd6fb9"} Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.062376 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-memcached-tls-certs\") pod \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.062548 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-config-data\") pod \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.062590 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrrrr\" (UniqueName: \"kubernetes.io/projected/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kube-api-access-nrrrr\") pod \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.062613 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-combined-ca-bundle\") pod \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.062643 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kolla-config\") pod \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\" (UID: \"ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.062976 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.062988 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a325b-7833-484f-8bba-7dc85ebf57cd-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.062996 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.063004 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.063014 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.063026 
4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3184c640-0157-4211-aa5a-aada8557e9f8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.066526 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-config-data" (OuterVolumeSpecName: "config-data") pod "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" (UID: "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.067737 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6810e266-dec6-4731-884b-067f214781c2" (UID: "6810e266-dec6-4731-884b-067f214781c2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.068703 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" (UID: "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.074976 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data" (OuterVolumeSpecName: "config-data") pod "920278e2-a31f-4ad2-81be-d30a799b9d64" (UID: "920278e2-a31f-4ad2-81be-d30a799b9d64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.075119 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kube-api-access-nrrrr" (OuterVolumeSpecName: "kube-api-access-nrrrr") pod "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" (UID: "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb"). InnerVolumeSpecName "kube-api-access-nrrrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.094557 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" (UID: "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.132580 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" (UID: "ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.163989 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6kwp\" (UniqueName: \"kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp\") pod \"keystone-2e4b-account-create-update-t2l6n\" (UID: \"eae1aad4-9fe6-4aa7-9071-170560f783af\") " pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.164075 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts\") pod \"keystone-2e4b-account-create-update-t2l6n\" (UID: \"eae1aad4-9fe6-4aa7-9071-170560f783af\") " pod="openstack/keystone-2e4b-account-create-update-t2l6n" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.164241 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.164261 4753 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.164273 4753 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.164284 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6810e266-dec6-4731-884b-067f214781c2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.164295 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920278e2-a31f-4ad2-81be-d30a799b9d64-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.164306 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.164318 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrrrr\" (UniqueName: \"kubernetes.io/projected/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb-kube-api-access-nrrrr\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: E0129 14:24:41.164406 4753 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 29 14:24:41 crc kubenswrapper[4753]: E0129 14:24:41.164450 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts podName:eae1aad4-9fe6-4aa7-9071-170560f783af nodeName:}" failed. No retries permitted until 2026-01-29 14:24:43.164435807 +0000 UTC m=+1317.859170189 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts") pod "keystone-2e4b-account-create-update-t2l6n" (UID: "eae1aad4-9fe6-4aa7-9071-170560f783af") : configmap "openstack-scripts" not found Jan 29 14:24:41 crc kubenswrapper[4753]: E0129 14:24:41.168050 4753 projected.go:194] Error preparing data for projected volume kube-api-access-f6kwp for pod openstack/keystone-2e4b-account-create-update-t2l6n: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 29 14:24:41 crc kubenswrapper[4753]: E0129 14:24:41.168093 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp podName:eae1aad4-9fe6-4aa7-9071-170560f783af nodeName:}" failed. No retries permitted until 2026-01-29 14:24:43.168083153 +0000 UTC m=+1317.862817535 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-f6kwp" (UniqueName: "kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp") pod "keystone-2e4b-account-create-update-t2l6n" (UID: "eae1aad4-9fe6-4aa7-9071-170560f783af") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.195910 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.202448 4753 scope.go:117] "RemoveContainer" containerID="eae4f86edd4d25f3149ae9e9e7b406efbd7b4f7e532051ca604088e784fc5e54" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.207785 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.258658 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_31019cc8-ce90-453f-be4f-949ed45a5873/ovn-northd/0.log" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.258719 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 14:24:41 crc kubenswrapper[4753]: E0129 14:24:41.266137 4753 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 29 14:24:41 crc kubenswrapper[4753]: E0129 14:24:41.267551 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data podName:ad5c04aa-ed92-4c33-ad37-4420b362e237 nodeName:}" failed. No retries permitted until 2026-01-29 14:24:49.267533283 +0000 UTC m=+1323.962267665 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data") pod "rabbitmq-server-0" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237") : configmap "rabbitmq-config-data" not found Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.282549 4753 scope.go:117] "RemoveContainer" containerID="e2d1fb3d7fa36ee6d4949b785910ce4cfa547832c2acade41dcbf22dc90c2c6e" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.299332 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.316255 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.325288 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.331637 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.355799 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2e4b-account-create-update-t2l6n"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.364105 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2e4b-account-create-update-t2l6n"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.377340 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.379014 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-metrics-certs-tls-certs\") pod \"31019cc8-ce90-453f-be4f-949ed45a5873\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.379079 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-northd-tls-certs\") pod \"31019cc8-ce90-453f-be4f-949ed45a5873\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.379186 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-rundir\") pod \"31019cc8-ce90-453f-be4f-949ed45a5873\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.379216 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-combined-ca-bundle\") pod \"31019cc8-ce90-453f-be4f-949ed45a5873\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.379259 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-scripts\") pod \"31019cc8-ce90-453f-be4f-949ed45a5873\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.379339 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk8k9\" (UniqueName: 
\"kubernetes.io/projected/31019cc8-ce90-453f-be4f-949ed45a5873-kube-api-access-zk8k9\") pod \"31019cc8-ce90-453f-be4f-949ed45a5873\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.379414 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-config\") pod \"31019cc8-ce90-453f-be4f-949ed45a5873\" (UID: \"31019cc8-ce90-453f-be4f-949ed45a5873\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.386968 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-config" (OuterVolumeSpecName: "config") pod "31019cc8-ce90-453f-be4f-949ed45a5873" (UID: "31019cc8-ce90-453f-be4f-949ed45a5873"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.388048 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "31019cc8-ce90-453f-be4f-949ed45a5873" (UID: "31019cc8-ce90-453f-be4f-949ed45a5873"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.390754 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-scripts" (OuterVolumeSpecName: "scripts") pod "31019cc8-ce90-453f-be4f-949ed45a5873" (UID: "31019cc8-ce90-453f-be4f-949ed45a5873"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.391798 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.397914 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31019cc8-ce90-453f-be4f-949ed45a5873-kube-api-access-zk8k9" (OuterVolumeSpecName: "kube-api-access-zk8k9") pod "31019cc8-ce90-453f-be4f-949ed45a5873" (UID: "31019cc8-ce90-453f-be4f-949ed45a5873"). InnerVolumeSpecName "kube-api-access-zk8k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.430508 4753 scope.go:117] "RemoveContainer" containerID="b0fc984341d3bf9cf81937c474fb2a82c3b897efaf7a2c1e16681e411cbe9085" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.434950 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.435366 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31019cc8-ce90-453f-be4f-949ed45a5873" (UID: "31019cc8-ce90-453f-be4f-949ed45a5873"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.441179 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.448005 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.452942 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.459410 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-754c57f55b-2hkbd"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.462077 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.464689 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-754c57f55b-2hkbd"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.465356 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "31019cc8-ce90-453f-be4f-949ed45a5873" (UID: "31019cc8-ce90-453f-be4f-949ed45a5873"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.469675 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.477550 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.490667 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6kwp\" (UniqueName: \"kubernetes.io/projected/eae1aad4-9fe6-4aa7-9071-170560f783af-kube-api-access-f6kwp\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.490716 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae1aad4-9fe6-4aa7-9071-170560f783af-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.490729 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.490741 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.490753 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.490763 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk8k9\" (UniqueName: \"kubernetes.io/projected/31019cc8-ce90-453f-be4f-949ed45a5873-kube-api-access-zk8k9\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.490772 4753 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/31019cc8-ce90-453f-be4f-949ed45a5873-config\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.490782 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.500736 4753 scope.go:117] "RemoveContainer" containerID="cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.533147 4753 scope.go:117] "RemoveContainer" containerID="cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf" Jan 29 14:24:41 crc kubenswrapper[4753]: E0129 14:24:41.533530 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf\": container with ID starting with cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf not found: ID does not exist" containerID="cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.533563 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf"} err="failed to get container status \"cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf\": rpc error: code = NotFound desc = could not find container \"cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf\": container with ID starting with cf8789bb641e83cb5db1723985aad3931f85ce111a50ecc4d4cc4ca0a164cebf not found: ID does not exist" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.533591 4753 scope.go:117] "RemoveContainer" containerID="27f10dc0be419156c4795db55c1bed92ee49c68890c1310f78d7e5642ab655c9" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.560185 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "31019cc8-ce90-453f-be4f-949ed45a5873" (UID: "31019cc8-ce90-453f-be4f-949ed45a5873"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.567355 4753 scope.go:117] "RemoveContainer" containerID="5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.592238 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/964a679c-ce73-46ec-8e88-37c972fbe817-operator-scripts\") pod \"964a679c-ce73-46ec-8e88-37c972fbe817\" (UID: \"964a679c-ce73-46ec-8e88-37c972fbe817\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.592593 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf9h8\" (UniqueName: \"kubernetes.io/projected/964a679c-ce73-46ec-8e88-37c972fbe817-kube-api-access-vf9h8\") pod \"964a679c-ce73-46ec-8e88-37c972fbe817\" (UID: \"964a679c-ce73-46ec-8e88-37c972fbe817\") " Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.592677 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/964a679c-ce73-46ec-8e88-37c972fbe817-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "964a679c-ce73-46ec-8e88-37c972fbe817" (UID: "964a679c-ce73-46ec-8e88-37c972fbe817"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.593565 4753 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/31019cc8-ce90-453f-be4f-949ed45a5873-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.593662 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/964a679c-ce73-46ec-8e88-37c972fbe817-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.596263 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964a679c-ce73-46ec-8e88-37c972fbe817-kube-api-access-vf9h8" (OuterVolumeSpecName: "kube-api-access-vf9h8") pod "964a679c-ce73-46ec-8e88-37c972fbe817" (UID: "964a679c-ce73-46ec-8e88-37c972fbe817"). InnerVolumeSpecName "kube-api-access-vf9h8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.604120 4753 scope.go:117] "RemoveContainer" containerID="5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48" Jan 29 14:24:41 crc kubenswrapper[4753]: E0129 14:24:41.604605 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48\": container with ID starting with 5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48 not found: ID does not exist" containerID="5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.604659 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48"} err="failed to get container status \"5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48\": rpc error: code = NotFound desc = could not find container \"5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48\": container with ID starting with 5fdaeb1028dee5248139bb0519ec3b1e33b3551f63e8d441100a3c0693b61a48 not found: ID does not exist" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.604712 4753 scope.go:117] "RemoveContainer" containerID="49145b48bee89f5fd944b3ebdb12f7d989505bd4659c96c697f65f65d2481518" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.617663 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.203:3000/\": dial tcp 10.217.0.203:3000: connect: connection refused" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.644195 4753 scope.go:117] "RemoveContainer" containerID="63959aa4ed603fda25a0942fccae384fdcda0338c3bd1c0131967af7f34b728b" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.669570 4753 scope.go:117] "RemoveContainer" containerID="24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.689124 4753 scope.go:117] "RemoveContainer" containerID="24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539" Jan 29 14:24:41 crc kubenswrapper[4753]: E0129 14:24:41.689530 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539\": container with ID starting with 24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539 not found: ID does not exist" containerID="24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.689559 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539"} err="failed to get container status \"24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539\": rpc error: code = NotFound desc = could not find container \"24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539\": container with ID starting with 24914f1c611828883d33720bc46b75efaab79bea1eb2b26d648832fc9f928539 not found: ID does not exist" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.689582 4753 scope.go:117] "RemoveContainer" 
containerID="bf98498966b9708676b313afca0a0b4bb674752fe39d67e44f9f70b35df870b7" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.696388 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf9h8\" (UniqueName: \"kubernetes.io/projected/964a679c-ce73-46ec-8e88-37c972fbe817-kube-api-access-vf9h8\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.726127 4753 scope.go:117] "RemoveContainer" containerID="f5895e5563577e5f99ca58e92f19470e5b0e974e28396e83be746eec355480e3" Jan 29 14:24:41 crc kubenswrapper[4753]: I0129 14:24:41.810585 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.000236 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f7e3e27-a036-4623-8d63-557a3c0d76e6-pod-info\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.000282 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f7e3e27-a036-4623-8d63-557a3c0d76e6-erlang-cookie-secret\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.000305 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-tls\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.000348 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-plugins-conf\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.000381 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-plugins\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.001097 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-confd\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.001134 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.001177 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-server-conf\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 
14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.001212 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.001233 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrhjh\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-kube-api-access-qrhjh\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.001276 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-erlang-cookie\") pod \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\" (UID: \"5f7e3e27-a036-4623-8d63-557a3c0d76e6\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.000895 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.002321 4753 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.001164 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.001947 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.004337 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.005279 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7e3e27-a036-4623-8d63-557a3c0d76e6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.012100 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.027783 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5f7e3e27-a036-4623-8d63-557a3c0d76e6-pod-info" (OuterVolumeSpecName: "pod-info") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.033059 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data" (OuterVolumeSpecName: "config-data") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.034928 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-kube-api-access-qrhjh" (OuterVolumeSpecName: "kube-api-access-qrhjh") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "kube-api-access-qrhjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.070014 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-server-conf" (OuterVolumeSpecName: "server-conf") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.095639 4753 generic.go:334] "Generic (PLEG): container finished" podID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" containerID="a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c" exitCode=0 Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.095711 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f7e3e27-a036-4623-8d63-557a3c0d76e6","Type":"ContainerDied","Data":"a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c"} Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.095737 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5f7e3e27-a036-4623-8d63-557a3c0d76e6","Type":"ContainerDied","Data":"c44106bb8710cc08a418ad82d9da2d097116bce22fb7ae11db8985c45ecf9081"} Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.095754 4753 scope.go:117] "RemoveContainer" containerID="a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.095850 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.105570 4753 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5f7e3e27-a036-4623-8d63-557a3c0d76e6-pod-info\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.105719 4753 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5f7e3e27-a036-4623-8d63-557a3c0d76e6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.105795 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.105865 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.105932 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.105999 4753 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5f7e3e27-a036-4623-8d63-557a3c0d76e6-server-conf\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.106117 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.117446 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrhjh\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-kube-api-access-qrhjh\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.117650 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.113917 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kt2lt" event={"ID":"964a679c-ce73-46ec-8e88-37c972fbe817","Type":"ContainerDied","Data":"d6cb9681ebd6ce332835e44baa8f3001b623d73fbbc9f5a58f6d7085523f0fe9"} Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.113984 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kt2lt" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.122958 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5f7e3e27-a036-4623-8d63-557a3c0d76e6" (UID: "5f7e3e27-a036-4623-8d63-557a3c0d76e6"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.123687 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_31019cc8-ce90-453f-be4f-949ed45a5873/ovn-northd/0.log" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.123765 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"31019cc8-ce90-453f-be4f-949ed45a5873","Type":"ContainerDied","Data":"65675f1aabccde5b55da60f3fa9c1ffebba3bb8602450f43d737e1445a53afbf"} Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.123881 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.140326 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.152902 4753 generic.go:334] "Generic (PLEG): container finished" podID="ad5c04aa-ed92-4c33-ad37-4420b362e237" containerID="3b34f95853a15ff9210f7c5a34e53924e5ea049fe09b94a0c39100cd6c83fdab" exitCode=0 Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.167645 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05315649-b501-4aae-9c14-4e632b89be53" path="/var/lib/kubelet/pods/05315649-b501-4aae-9c14-4e632b89be53/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.168500 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" path="/var/lib/kubelet/pods/13672aee-1e34-4763-88d7-35ac9b484c87/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.169302 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15b88ba9-8449-4e76-a36c-34ca2b2be488" path="/var/lib/kubelet/pods/15b88ba9-8449-4e76-a36c-34ca2b2be488/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.170427 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3184c640-0157-4211-aa5a-aada8557e9f8" path="/var/lib/kubelet/pods/3184c640-0157-4211-aa5a-aada8557e9f8/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.170730 4753 scope.go:117] "RemoveContainer" containerID="81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.171184 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6810e266-dec6-4731-884b-067f214781c2" path="/var/lib/kubelet/pods/6810e266-dec6-4731-884b-067f214781c2/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.171801 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" path="/var/lib/kubelet/pods/7947b8f0-b134-40d9-beba-116bbb51a1c2/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.172990 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" path="/var/lib/kubelet/pods/920278e2-a31f-4ad2-81be-d30a799b9d64/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.173588 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a641f31f-1bb2-4a49-8e74-3d5baf14bfe7" path="/var/lib/kubelet/pods/a641f31f-1bb2-4a49-8e74-3d5baf14bfe7/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.174620 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="acfc5598-76b4-4673-aeef-b9105a8e6853" path="/var/lib/kubelet/pods/acfc5598-76b4-4673-aeef-b9105a8e6853/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.174853 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae1aad4-9fe6-4aa7-9071-170560f783af" path="/var/lib/kubelet/pods/eae1aad4-9fe6-4aa7-9071-170560f783af/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.175126 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb7a325b-7833-484f-8bba-7dc85ebf57cd" path="/var/lib/kubelet/pods/fb7a325b-7833-484f-8bba-7dc85ebf57cd/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.175619 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" path="/var/lib/kubelet/pods/ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb/volumes" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.176655 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad5c04aa-ed92-4c33-ad37-4420b362e237","Type":"ContainerDied","Data":"3b34f95853a15ff9210f7c5a34e53924e5ea049fe09b94a0c39100cd6c83fdab"} Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.185082 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.185271 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.186720 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.186859 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.187327 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.187355 4753 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container 
is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server" Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.188509 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.188533 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.204495 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rm9d5" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" containerName="ovn-controller" probeResult="failure" output="command timed out" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.206918 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kt2lt"] Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.216168 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-kt2lt"] Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.219108 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5f7e3e27-a036-4623-8d63-557a3c0d76e6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.219137 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.224619 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.227576 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.276833 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rm9d5" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" containerName="ovn-controller" probeResult="failure" output=< Jan 29 14:24:42 crc kubenswrapper[4753]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Jan 29 14:24:42 crc kubenswrapper[4753]: > Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.284843 4753 scope.go:117] "RemoveContainer" containerID="a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c" Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.285188 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c\": container with ID starting with a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c not found: ID does not exist" 
containerID="a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.285226 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c"} err="failed to get container status \"a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c\": rpc error: code = NotFound desc = could not find container \"a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c\": container with ID starting with a459b086bf2795b9932ccea6d332e37422701a9fc3ea75796c1296149aac276c not found: ID does not exist" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.285247 4753 scope.go:117] "RemoveContainer" containerID="81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd" Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.285606 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd\": container with ID starting with 81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd not found: ID does not exist" containerID="81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.285639 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd"} err="failed to get container status \"81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd\": rpc error: code = NotFound desc = could not find container \"81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd\": container with ID starting with 81b6287c3cb8d9ffb1d77c00f95145ae012417ef853faa3a3c7f549d9cfb1fdd not found: ID does not exist" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.285653 4753 scope.go:117] "RemoveContainer" containerID="0c9a422d95efc2b8a373980fb0f3a46037a883ada7821c87b5bc7209541856f4" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.311991 4753 scope.go:117] "RemoveContainer" containerID="b19308d0814c3df635fdb38228ce9b7ebf5a99fefcc0274c1834d736932a59bd" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.366576 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421141 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421226 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad5c04aa-ed92-4c33-ad37-4420b362e237-erlang-cookie-secret\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421281 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-tls\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421316 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-plugins\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421372 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-confd\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421420 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-server-conf\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421450 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421518 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl52d\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-kube-api-access-cl52d\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421559 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad5c04aa-ed92-4c33-ad37-4420b362e237-pod-info\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421595 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-erlang-cookie\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: 
\"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.421623 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-plugins-conf\") pod \"ad5c04aa-ed92-4c33-ad37-4420b362e237\" (UID: \"ad5c04aa-ed92-4c33-ad37-4420b362e237\") " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.444415 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.445250 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.446930 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5c04aa-ed92-4c33-ad37-4420b362e237-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.448357 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.451272 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.455193 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.461580 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-kube-api-access-cl52d" (OuterVolumeSpecName: "kube-api-access-cl52d") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "kube-api-access-cl52d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.466511 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.468874 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data" (OuterVolumeSpecName: "config-data") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.471307 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ad5c04aa-ed92-4c33-ad37-4420b362e237-pod-info" (OuterVolumeSpecName: "pod-info") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.473037 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.514656 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-server-conf" (OuterVolumeSpecName: "server-conf") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.531570 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ad5c04aa-ed92-4c33-ad37-4420b362e237" (UID: "ad5c04aa-ed92-4c33-ad37-4420b362e237"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544525 4753 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-server-conf\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544599 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544615 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl52d\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-kube-api-access-cl52d\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544625 4753 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad5c04aa-ed92-4c33-ad37-4420b362e237-pod-info\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544637 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544648 4753 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544659 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad5c04aa-ed92-4c33-ad37-4420b362e237-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544669 4753 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad5c04aa-ed92-4c33-ad37-4420b362e237-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544679 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544690 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.544701 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad5c04aa-ed92-4c33-ad37-4420b362e237-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.562332 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.646507 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.993362 4753 
handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 29 14:24:42 crc kubenswrapper[4753]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-01-29T14:24:35Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Jan 29 14:24:42 crc kubenswrapper[4753]: /etc/init.d/functions: line 589: 393 Alarm clock "$@" Jan 29 14:24:42 crc kubenswrapper[4753]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-rm9d5" message=< Jan 29 14:24:42 crc kubenswrapper[4753]: Exiting ovn-controller (1) [FAILED] Jan 29 14:24:42 crc kubenswrapper[4753]: Killing ovn-controller (1) [ OK ] Jan 29 14:24:42 crc kubenswrapper[4753]: Killing ovn-controller (1) with SIGKILL [ OK ] Jan 29 14:24:42 crc kubenswrapper[4753]: 2026-01-29T14:24:35Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Jan 29 14:24:42 crc kubenswrapper[4753]: /etc/init.d/functions: line 589: 393 Alarm clock "$@" Jan 29 14:24:42 crc kubenswrapper[4753]: > Jan 29 14:24:42 crc kubenswrapper[4753]: E0129 14:24:42.993686 4753 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 29 14:24:42 crc kubenswrapper[4753]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-01-29T14:24:35Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Jan 29 14:24:42 crc kubenswrapper[4753]: /etc/init.d/functions: line 589: 393 Alarm clock "$@" Jan 29 14:24:42 crc kubenswrapper[4753]: > pod="openstack/ovn-controller-rm9d5" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" containerName="ovn-controller" containerID="cri-o://a6dfbee60ef1dbfc42673695d5f8844faa43eef9fa5e99e94df7f7bc0fbae6ec" Jan 29 14:24:42 crc kubenswrapper[4753]: I0129 14:24:42.993729 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-rm9d5" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" containerName="ovn-controller" containerID="cri-o://a6dfbee60ef1dbfc42673695d5f8844faa43eef9fa5e99e94df7f7bc0fbae6ec" gracePeriod=22 Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.169536 4753 generic.go:334] "Generic (PLEG): container finished" podID="94157b6b-3cc9-44e9-9625-64d34611046a" containerID="85e717f2d1168cff52e5656e97f7028853eb41a9dedbe7b1a8d1cda97bf06e35" exitCode=0 Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.169602 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94157b6b-3cc9-44e9-9625-64d34611046a","Type":"ContainerDied","Data":"85e717f2d1168cff52e5656e97f7028853eb41a9dedbe7b1a8d1cda97bf06e35"} Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.170808 4753 generic.go:334] "Generic (PLEG): container finished" podID="cf6045aa-89c7-46c0-ba1e-4d63b9740883" containerID="e450be8a7013dbd08831b7150e1c8ccd73e085bf16b828b286d89403e5f7cfbc" exitCode=0 Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.170843 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b9f57fc94-gqqlc" event={"ID":"cf6045aa-89c7-46c0-ba1e-4d63b9740883","Type":"ContainerDied","Data":"e450be8a7013dbd08831b7150e1c8ccd73e085bf16b828b286d89403e5f7cfbc"} Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.170857 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b9f57fc94-gqqlc" event={"ID":"cf6045aa-89c7-46c0-ba1e-4d63b9740883","Type":"ContainerDied","Data":"d945de52fc406e592ec0c1c5c84efcee7158744da734004630f75b6961d196a9"} Jan 29 14:24:43 crc kubenswrapper[4753]: 
I0129 14:24:43.170868 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d945de52fc406e592ec0c1c5c84efcee7158744da734004630f75b6961d196a9" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.172569 4753 generic.go:334] "Generic (PLEG): container finished" podID="af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" containerID="1b5479e0d4430d6fc3745e3ed9afa5a4334d10df67d89187482d0455266b8f05" exitCode=0 Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.172603 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea","Type":"ContainerDied","Data":"1b5479e0d4430d6fc3745e3ed9afa5a4334d10df67d89187482d0455266b8f05"} Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.172617 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea","Type":"ContainerDied","Data":"1552ae38dfaf11b830c740e2f045634d853aca7e5898bddb3981b56216ade5a1"} Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.172625 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1552ae38dfaf11b830c740e2f045634d853aca7e5898bddb3981b56216ade5a1" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.176880 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rm9d5_5ca7a69c-2f29-46d8-ab2a-67393114629f/ovn-controller/0.log" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.176909 4753 generic.go:334] "Generic (PLEG): container finished" podID="5ca7a69c-2f29-46d8-ab2a-67393114629f" containerID="a6dfbee60ef1dbfc42673695d5f8844faa43eef9fa5e99e94df7f7bc0fbae6ec" exitCode=137 Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.176950 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rm9d5" event={"ID":"5ca7a69c-2f29-46d8-ab2a-67393114629f","Type":"ContainerDied","Data":"a6dfbee60ef1dbfc42673695d5f8844faa43eef9fa5e99e94df7f7bc0fbae6ec"} Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.186110 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad5c04aa-ed92-4c33-ad37-4420b362e237","Type":"ContainerDied","Data":"e04a382dc583dcc5b5c0702ebe67f18b09aff1f243ceb257a57d7d0071dc19ec"} Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.186138 4753 scope.go:117] "RemoveContainer" containerID="3b34f95853a15ff9210f7c5a34e53924e5ea049fe09b94a0c39100cd6c83fdab" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.186287 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.200174 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.217372 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7b9f57fc94-gqqlc" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.250478 4753 scope.go:117] "RemoveContainer" containerID="6ea11bc6de1dca2ccb62590a48ec34192f4584015fe67a14444f1dcae0fb9d8d" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261145 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-credential-keys\") pod \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261216 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-combined-ca-bundle\") pod \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261263 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-scripts\") pod \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261290 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-fernet-keys\") pod \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261337 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzpx4\" (UniqueName: \"kubernetes.io/projected/cf6045aa-89c7-46c0-ba1e-4d63b9740883-kube-api-access-pzpx4\") pod \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261365 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-internal-tls-certs\") pod \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261422 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-config-data\") pod \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261464 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxrm9\" (UniqueName: \"kubernetes.io/projected/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-kube-api-access-pxrm9\") pod \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261489 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-combined-ca-bundle\") pod \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261518 4753 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-public-tls-certs\") pod \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\" (UID: \"cf6045aa-89c7-46c0-ba1e-4d63b9740883\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.261547 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-config-data\") pod \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\" (UID: \"af44a6dc-a0bc-487f-b82a-05e0d08aa7ea\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.281651 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.290160 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6045aa-89c7-46c0-ba1e-4d63b9740883-kube-api-access-pzpx4" (OuterVolumeSpecName: "kube-api-access-pzpx4") pod "cf6045aa-89c7-46c0-ba1e-4d63b9740883" (UID: "cf6045aa-89c7-46c0-ba1e-4d63b9740883"). InnerVolumeSpecName "kube-api-access-pzpx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.292844 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cf6045aa-89c7-46c0-ba1e-4d63b9740883" (UID: "cf6045aa-89c7-46c0-ba1e-4d63b9740883"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.293330 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-scripts" (OuterVolumeSpecName: "scripts") pod "cf6045aa-89c7-46c0-ba1e-4d63b9740883" (UID: "cf6045aa-89c7-46c0-ba1e-4d63b9740883"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.296916 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cf6045aa-89c7-46c0-ba1e-4d63b9740883" (UID: "cf6045aa-89c7-46c0-ba1e-4d63b9740883"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.298180 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf6045aa-89c7-46c0-ba1e-4d63b9740883" (UID: "cf6045aa-89c7-46c0-ba1e-4d63b9740883"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.298317 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-kube-api-access-pxrm9" (OuterVolumeSpecName: "kube-api-access-pxrm9") pod "af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" (UID: "af44a6dc-a0bc-487f-b82a-05e0d08aa7ea"). InnerVolumeSpecName "kube-api-access-pxrm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.306733 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0"
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.311059 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.311420 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-config-data" (OuterVolumeSpecName: "config-data") pod "af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" (UID: "af44a6dc-a0bc-487f-b82a-05e0d08aa7ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.319233 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" (UID: "af44a6dc-a0bc-487f-b82a-05e0d08aa7ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.344644 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-config-data" (OuterVolumeSpecName: "config-data") pod "cf6045aa-89c7-46c0-ba1e-4d63b9740883" (UID: "cf6045aa-89c7-46c0-ba1e-4d63b9740883"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.362852 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-galera-tls-certs\") pod \"94157b6b-3cc9-44e9-9625-64d34611046a\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.362899 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-kolla-config\") pod \"94157b6b-3cc9-44e9-9625-64d34611046a\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.362922 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-generated\") pod \"94157b6b-3cc9-44e9-9625-64d34611046a\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.362952 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"94157b6b-3cc9-44e9-9625-64d34611046a\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.362977 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jnmh\" (UniqueName: \"kubernetes.io/projected/94157b6b-3cc9-44e9-9625-64d34611046a-kube-api-access-2jnmh\") pod \"94157b6b-3cc9-44e9-9625-64d34611046a\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363006 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-operator-scripts\") pod \"94157b6b-3cc9-44e9-9625-64d34611046a\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363049 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-combined-ca-bundle\") pod \"94157b6b-3cc9-44e9-9625-64d34611046a\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363091 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-default\") pod \"94157b6b-3cc9-44e9-9625-64d34611046a\" (UID: \"94157b6b-3cc9-44e9-9625-64d34611046a\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363274 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cf6045aa-89c7-46c0-ba1e-4d63b9740883" (UID: "cf6045aa-89c7-46c0-ba1e-4d63b9740883"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363507 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363526 4753 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363537 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzpx4\" (UniqueName: \"kubernetes.io/projected/cf6045aa-89c7-46c0-ba1e-4d63b9740883-kube-api-access-pzpx4\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363547 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363555 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363565 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxrm9\" (UniqueName: \"kubernetes.io/projected/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-kube-api-access-pxrm9\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363573 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363581 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363590 4753 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.363597 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.364215 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "94157b6b-3cc9-44e9-9625-64d34611046a" (UID: "94157b6b-3cc9-44e9-9625-64d34611046a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.364323 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "94157b6b-3cc9-44e9-9625-64d34611046a" (UID: "94157b6b-3cc9-44e9-9625-64d34611046a"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.364563 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "94157b6b-3cc9-44e9-9625-64d34611046a" (UID: "94157b6b-3cc9-44e9-9625-64d34611046a"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.364777 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94157b6b-3cc9-44e9-9625-64d34611046a" (UID: "94157b6b-3cc9-44e9-9625-64d34611046a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.366456 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf6045aa-89c7-46c0-ba1e-4d63b9740883" (UID: "cf6045aa-89c7-46c0-ba1e-4d63b9740883"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.367767 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94157b6b-3cc9-44e9-9625-64d34611046a-kube-api-access-2jnmh" (OuterVolumeSpecName: "kube-api-access-2jnmh") pod "94157b6b-3cc9-44e9-9625-64d34611046a" (UID: "94157b6b-3cc9-44e9-9625-64d34611046a"). InnerVolumeSpecName "kube-api-access-2jnmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.373280 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "94157b6b-3cc9-44e9-9625-64d34611046a" (UID: "94157b6b-3cc9-44e9-9625-64d34611046a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.400039 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94157b6b-3cc9-44e9-9625-64d34611046a" (UID: "94157b6b-3cc9-44e9-9625-64d34611046a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.410433 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "94157b6b-3cc9-44e9-9625-64d34611046a" (UID: "94157b6b-3cc9-44e9-9625-64d34611046a"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.465360 4753 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.465391 4753 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-kolla-config\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.465429 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-generated\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.465466 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.465477 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jnmh\" (UniqueName: \"kubernetes.io/projected/94157b6b-3cc9-44e9-9625-64d34611046a-kube-api-access-2jnmh\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.465487 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6045aa-89c7-46c0-ba1e-4d63b9740883-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.465496 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.465504 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94157b6b-3cc9-44e9-9625-64d34611046a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.465512 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94157b6b-3cc9-44e9-9625-64d34611046a-config-data-default\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.469540 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rm9d5_5ca7a69c-2f29-46d8-ab2a-67393114629f/ovn-controller/0.log"
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.469598 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rm9d5"
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.510025 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.566323 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqb2l\" (UniqueName: \"kubernetes.io/projected/5ca7a69c-2f29-46d8-ab2a-67393114629f-kube-api-access-mqb2l\") pod \"5ca7a69c-2f29-46d8-ab2a-67393114629f\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.566366 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run-ovn\") pod \"5ca7a69c-2f29-46d8-ab2a-67393114629f\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.566423 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ca7a69c-2f29-46d8-ab2a-67393114629f-scripts\") pod \"5ca7a69c-2f29-46d8-ab2a-67393114629f\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.566451 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-log-ovn\") pod \"5ca7a69c-2f29-46d8-ab2a-67393114629f\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.566492 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-ovn-controller-tls-certs\") pod \"5ca7a69c-2f29-46d8-ab2a-67393114629f\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.566514 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run\") pod \"5ca7a69c-2f29-46d8-ab2a-67393114629f\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.566549 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-combined-ca-bundle\") pod \"5ca7a69c-2f29-46d8-ab2a-67393114629f\" (UID: \"5ca7a69c-2f29-46d8-ab2a-67393114629f\") "
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.566820 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.567109 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5ca7a69c-2f29-46d8-ab2a-67393114629f" (UID: "5ca7a69c-2f29-46d8-ab2a-67393114629f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.567179 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5ca7a69c-2f29-46d8-ab2a-67393114629f" (UID: "5ca7a69c-2f29-46d8-ab2a-67393114629f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.568066 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca7a69c-2f29-46d8-ab2a-67393114629f-scripts" (OuterVolumeSpecName: "scripts") pod "5ca7a69c-2f29-46d8-ab2a-67393114629f" (UID: "5ca7a69c-2f29-46d8-ab2a-67393114629f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.569982 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca7a69c-2f29-46d8-ab2a-67393114629f-kube-api-access-mqb2l" (OuterVolumeSpecName: "kube-api-access-mqb2l") pod "5ca7a69c-2f29-46d8-ab2a-67393114629f" (UID: "5ca7a69c-2f29-46d8-ab2a-67393114629f"). InnerVolumeSpecName "kube-api-access-mqb2l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.570142 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run" (OuterVolumeSpecName: "var-run") pod "5ca7a69c-2f29-46d8-ab2a-67393114629f" (UID: "5ca7a69c-2f29-46d8-ab2a-67393114629f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.592531 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ca7a69c-2f29-46d8-ab2a-67393114629f" (UID: "5ca7a69c-2f29-46d8-ab2a-67393114629f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.621829 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "5ca7a69c-2f29-46d8-ab2a-67393114629f" (UID: "5ca7a69c-2f29-46d8-ab2a-67393114629f"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.641665 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.668405 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqb2l\" (UniqueName: \"kubernetes.io/projected/5ca7a69c-2f29-46d8-ab2a-67393114629f-kube-api-access-mqb2l\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.668444 4753 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.668457 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ca7a69c-2f29-46d8-ab2a-67393114629f-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.668471 4753 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.668484 4753 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.668496 4753 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ca7a69c-2f29-46d8-ab2a-67393114629f-var-run\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.668507 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca7a69c-2f29-46d8-ab2a-67393114629f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.769689 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr4sw\" (UniqueName: \"kubernetes.io/projected/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-kube-api-access-kr4sw\") pod \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.769846 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-scripts\") pod \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.769886 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-config-data\") pod \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.769903 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-combined-ca-bundle\") pod \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.769925 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-sg-core-conf-yaml\") pod \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.770004 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-ceilometer-tls-certs\") pod \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.770036 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-run-httpd\") pod \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.770073 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-log-httpd\") pod \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\" (UID: \"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d\") " Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.770850 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" (UID: "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.772552 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" (UID: "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.775309 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-scripts" (OuterVolumeSpecName: "scripts") pod "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" (UID: "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.777006 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-kube-api-access-kr4sw" (OuterVolumeSpecName: "kube-api-access-kr4sw") pod "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" (UID: "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d"). InnerVolumeSpecName "kube-api-access-kr4sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.791810 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" (UID: "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.812888 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" (UID: "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.833972 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" (UID: "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.858134 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-config-data" (OuterVolumeSpecName: "config-data") pod "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" (UID: "d5ffa6e0-f5c8-4e29-84e2-a02a8061101d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.872086 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.872324 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.872465 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.872529 4753 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.872582 4753 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.872632 4753 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.872691 4753 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:43 crc kubenswrapper[4753]: I0129 14:24:43.872749 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr4sw\" (UniqueName: \"kubernetes.io/projected/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d-kube-api-access-kr4sw\") on node \"crc\" DevicePath \"\"" Jan 29 14:24:44 crc kubenswrapper[4753]: 
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.004973 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-55965d95bf-pftcq" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": dial tcp 10.217.0.156:9696: connect: connection refused"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.158028 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" path="/var/lib/kubelet/pods/31019cc8-ce90-453f-be4f-949ed45a5873/volumes"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.158759 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" path="/var/lib/kubelet/pods/5f7e3e27-a036-4623-8d63-557a3c0d76e6/volumes"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.159275 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="964a679c-ce73-46ec-8e88-37c972fbe817" path="/var/lib/kubelet/pods/964a679c-ce73-46ec-8e88-37c972fbe817/volumes"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.160225 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5c04aa-ed92-4c33-ad37-4420b362e237" path="/var/lib/kubelet/pods/ad5c04aa-ed92-4c33-ad37-4420b362e237/volumes"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.195469 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rm9d5_5ca7a69c-2f29-46d8-ab2a-67393114629f/ovn-controller/0.log"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.195831 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rm9d5" event={"ID":"5ca7a69c-2f29-46d8-ab2a-67393114629f","Type":"ContainerDied","Data":"50e506e8e9f1b22f13930607560c49f3d08c0311073e60c9d11c1906427e185e"}
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.195886 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rm9d5"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.195889 4753 scope.go:117] "RemoveContainer" containerID="a6dfbee60ef1dbfc42673695d5f8844faa43eef9fa5e99e94df7f7bc0fbae6ec"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.200061 4753 generic.go:334] "Generic (PLEG): container finished" podID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerID="f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5" exitCode=0
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.200113 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.200121 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerDied","Data":"f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5"}
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.200249 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5ffa6e0-f5c8-4e29-84e2-a02a8061101d","Type":"ContainerDied","Data":"13ebadd82bb3bab52a4dc2e23a96c886a9bb225f678fbd5dc64f63564f50661a"}
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.204684 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b9f57fc94-gqqlc"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.204865 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.204866 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"94157b6b-3cc9-44e9-9625-64d34611046a","Type":"ContainerDied","Data":"5fbe6adbe68b71996173814eff92879f61d37767a107508b1ea3b41021023941"}
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.204991 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.242356 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.245352 4753 scope.go:117] "RemoveContainer" containerID="0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.248249 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.260312 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.267636 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.274902 4753 scope.go:117] "RemoveContainer" containerID="8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.283660 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7b9f57fc94-gqqlc"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.287519 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7b9f57fc94-gqqlc"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.299325 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rm9d5"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.301568 4753 scope.go:117] "RemoveContainer" containerID="f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.306033 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rm9d5"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.310298 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.315209 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.321120 4753 scope.go:117] "RemoveContainer" containerID="77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.338499 4753 scope.go:117] "RemoveContainer" containerID="0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc"
Jan 29 14:24:44 crc kubenswrapper[4753]: E0129 14:24:44.338938 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc\": container with ID starting with 0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc not found: ID does not exist" containerID="0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.338993 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc"} err="failed to get container status \"0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc\": rpc error: code = NotFound desc = could not find container \"0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc\": container with ID starting with 0872f47f0837e2e340e203dad13fbad322d2f2d5612e6e032417fc797f2927cc not found: ID does not exist"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.339014 4753 scope.go:117] "RemoveContainer" containerID="8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4"
Jan 29 14:24:44 crc kubenswrapper[4753]: E0129 14:24:44.339475 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4\": container with ID starting with 8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4 not found: ID does not exist" containerID="8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.339505 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4"} err="failed to get container status \"8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4\": rpc error: code = NotFound desc = could not find container \"8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4\": container with ID starting with 8f4f504bb7514d17de6f9d0e4bc29ed5bbb7a16fcbf60f50df950c1d011171f4 not found: ID does not exist"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.339522 4753 scope.go:117] "RemoveContainer" containerID="f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5"
Jan 29 14:24:44 crc kubenswrapper[4753]: E0129 14:24:44.339831 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5\": container with ID starting with f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5 not found: ID does not exist" containerID="f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.340039 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5"} err="failed to get container status \"f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5\": rpc error: code = NotFound desc = could not find container \"f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5\": container with ID starting with f5cafdd4943e781ba476a6c69f1dc71b80a9449ef462d25a8c8b8c34d0874fc5 not found: ID does not exist"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.342300 4753 scope.go:117] "RemoveContainer" containerID="77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd"
Jan 29 14:24:44 crc kubenswrapper[4753]: E0129 14:24:44.342797 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd\": container with ID starting with 77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd not found: ID does not exist" containerID="77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.342832 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd"} err="failed to get container status \"77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd\": rpc error: code = NotFound desc = could not find container \"77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd\": container with ID starting with 77a59d6786fb7372e272cf55c73dd8632bccf7c850c6664fd7b5535d55c355cd not found: ID does not exist"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.342852 4753 scope.go:117] "RemoveContainer" containerID="85e717f2d1168cff52e5656e97f7028853eb41a9dedbe7b1a8d1cda97bf06e35"
Jan 29 14:24:44 crc kubenswrapper[4753]: I0129 14:24:44.367278 4753 scope.go:117] "RemoveContainer" containerID="0f883b984a6efe1a3b4819a8200d14cf585a8a3d8843d988a84fda1b04aaa30e"
Jan 29 14:24:45 crc kubenswrapper[4753]: I0129 14:24:45.022399 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-754c57f55b-2hkbd" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.161:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 29 14:24:45 crc kubenswrapper[4753]: I0129 14:24:45.022834 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-754c57f55b-2hkbd" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.161:9311/healthcheck\": dial tcp 10.217.0.161:9311: i/o timeout"
Jan 29 14:24:46 crc kubenswrapper[4753]: I0129 14:24:46.208478 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" path="/var/lib/kubelet/pods/5ca7a69c-2f29-46d8-ab2a-67393114629f/volumes"
Jan 29 14:24:46 crc kubenswrapper[4753]: I0129 14:24:46.209856 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94157b6b-3cc9-44e9-9625-64d34611046a" path="/var/lib/kubelet/pods/94157b6b-3cc9-44e9-9625-64d34611046a/volumes"
Jan 29 14:24:46 crc kubenswrapper[4753]: I0129 14:24:46.211884 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" path="/var/lib/kubelet/pods/af44a6dc-a0bc-487f-b82a-05e0d08aa7ea/volumes"
Jan 29 14:24:46 crc kubenswrapper[4753]: I0129 14:24:46.214398 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf6045aa-89c7-46c0-ba1e-4d63b9740883" path="/var/lib/kubelet/pods/cf6045aa-89c7-46c0-ba1e-4d63b9740883/volumes"
Jan 29 14:24:46 crc kubenswrapper[4753]: I0129 14:24:46.215808 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" path="/var/lib/kubelet/pods/d5ffa6e0-f5c8-4e29-84e2-a02a8061101d/volumes"
Jan 29 14:24:47 crc kubenswrapper[4753]: E0129 14:24:47.185017 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:24:47 crc kubenswrapper[4753]: E0129 14:24:47.185703 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:24:47 crc kubenswrapper[4753]: E0129 14:24:47.186033 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:24:47 crc kubenswrapper[4753]: E0129 14:24:47.186772 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:24:47 crc kubenswrapper[4753]: E0129 14:24:47.186772 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:24:47 crc kubenswrapper[4753]: E0129 14:24:47.186968 4753 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server"
Jan 29 14:24:47 crc kubenswrapper[4753]: E0129 14:24:47.189517 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:24:47 crc kubenswrapper[4753]: E0129 14:24:47.189615 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd"
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.285639 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55965d95bf-pftcq"
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.303184 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-ovndb-tls-certs\") pod \"1897b4f4-3f70-4584-9801-59c207f4d1db\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") "
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.303260 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-public-tls-certs\") pod \"1897b4f4-3f70-4584-9801-59c207f4d1db\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") "
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.303329 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-internal-tls-certs\") pod \"1897b4f4-3f70-4584-9801-59c207f4d1db\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") "
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.303378 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-config\") pod \"1897b4f4-3f70-4584-9801-59c207f4d1db\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") "
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.303417 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-combined-ca-bundle\") pod \"1897b4f4-3f70-4584-9801-59c207f4d1db\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") "
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.303461 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-httpd-config\") pod \"1897b4f4-3f70-4584-9801-59c207f4d1db\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") "
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.303619 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpg98\" (UniqueName: \"kubernetes.io/projected/1897b4f4-3f70-4584-9801-59c207f4d1db-kube-api-access-lpg98\") pod \"1897b4f4-3f70-4584-9801-59c207f4d1db\" (UID: \"1897b4f4-3f70-4584-9801-59c207f4d1db\") "
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.305980 4753 generic.go:334] "Generic (PLEG): container finished" podID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerID="24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78" exitCode=0
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.306033 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55965d95bf-pftcq" event={"ID":"1897b4f4-3f70-4584-9801-59c207f4d1db","Type":"ContainerDied","Data":"24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78"}
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.306088 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55965d95bf-pftcq" event={"ID":"1897b4f4-3f70-4584-9801-59c207f4d1db","Type":"ContainerDied","Data":"0fd70a45beb18140cb3620e5018c855d745317f628eca68ac5a1e7e4ce42157c"}
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.306115 4753 scope.go:117] "RemoveContainer" containerID="ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560"
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.306120 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55965d95bf-pftcq"
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.311928 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1897b4f4-3f70-4584-9801-59c207f4d1db" (UID: "1897b4f4-3f70-4584-9801-59c207f4d1db"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.312724 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1897b4f4-3f70-4584-9801-59c207f4d1db-kube-api-access-lpg98" (OuterVolumeSpecName: "kube-api-access-lpg98") pod "1897b4f4-3f70-4584-9801-59c207f4d1db" (UID: "1897b4f4-3f70-4584-9801-59c207f4d1db"). InnerVolumeSpecName "kube-api-access-lpg98". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.372408 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1897b4f4-3f70-4584-9801-59c207f4d1db" (UID: "1897b4f4-3f70-4584-9801-59c207f4d1db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.378741 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1897b4f4-3f70-4584-9801-59c207f4d1db" (UID: "1897b4f4-3f70-4584-9801-59c207f4d1db"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.400393 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-config" (OuterVolumeSpecName: "config") pod "1897b4f4-3f70-4584-9801-59c207f4d1db" (UID: "1897b4f4-3f70-4584-9801-59c207f4d1db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.403865 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1897b4f4-3f70-4584-9801-59c207f4d1db" (UID: "1897b4f4-3f70-4584-9801-59c207f4d1db"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.406576 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1897b4f4-3f70-4584-9801-59c207f4d1db" (UID: "1897b4f4-3f70-4584-9801-59c207f4d1db"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.406078 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.408296 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpg98\" (UniqueName: \"kubernetes.io/projected/1897b4f4-3f70-4584-9801-59c207f4d1db-kube-api-access-lpg98\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.408323 4753 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.408344 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.451401 4753 scope.go:117] "RemoveContainer" containerID="24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78"
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.470510 4753 scope.go:117] "RemoveContainer" containerID="ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560"
Jan 29 14:24:51 crc kubenswrapper[4753]: E0129 14:24:51.471475 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560\": container with ID starting with ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560 not found: ID does not exist" containerID="ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560"
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.471525 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560"} err="failed to get container status \"ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560\": rpc error: code = NotFound desc = could not find container \"ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560\": container with ID starting with ae68a7c0713b63761334d704b6d58a6e586d99734ba26a5dd648aa2966357560 not found: ID does not exist"
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.471553 4753 scope.go:117] "RemoveContainer" containerID="24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78"
Jan 29 14:24:51 crc kubenswrapper[4753]: E0129 14:24:51.471834 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78\": container with ID starting with 24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78 not found: ID does not exist" containerID="24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78"
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.471862 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78"} err="failed to get container status \"24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78\": rpc error: code = NotFound desc = could not find container \"24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78\": container with ID starting with 24b7d8e79af178fbb041e6823eaec921f1b4acb05681424366ec240a0da10d78 not found: ID does not exist"
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.510322 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-config\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.510386 4753 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.510408 4753 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1897b4f4-3f70-4584-9801-59c207f4d1db-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.646891 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55965d95bf-pftcq"]
Jan 29 14:24:51 crc kubenswrapper[4753]: I0129 14:24:51.653784 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-55965d95bf-pftcq"]
Jan 29 14:24:52 crc kubenswrapper[4753]: I0129 14:24:52.159368 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" path="/var/lib/kubelet/pods/1897b4f4-3f70-4584-9801-59c207f4d1db/volumes"
Jan 29 14:24:52 crc kubenswrapper[4753]: E0129 14:24:52.184370 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:24:52 crc kubenswrapper[4753]: E0129 14:24:52.184849 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:24:52 crc kubenswrapper[4753]: E0129 14:24:52.185203 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:24:52 crc kubenswrapper[4753]: E0129 14:24:52.185228 4753 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server"
Jan 29 14:24:52 crc kubenswrapper[4753]: E0129 14:24:52.185919 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:24:52 crc kubenswrapper[4753]: E0129 14:24:52.187523 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:24:52 crc kubenswrapper[4753]: E0129 14:24:52.189327 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:24:52 crc kubenswrapper[4753]: E0129 14:24:52.189396 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd"
Jan 29 14:24:57 crc kubenswrapper[4753]: I0129 14:24:57.054788 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 14:24:57 crc kubenswrapper[4753]: I0129 14:24:57.055279 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 14:24:57 crc kubenswrapper[4753]: E0129 14:24:57.183697 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:24:57 crc kubenswrapper[4753]: E0129 14:24:57.184633 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:24:57 crc kubenswrapper[4753]: E0129 14:24:57.185088 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:24:57 crc kubenswrapper[4753]: E0129 14:24:57.185229 4753 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server"
Jan 29 14:24:57 crc kubenswrapper[4753]: E0129 14:24:57.185689 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:24:57 crc kubenswrapper[4753]: E0129 14:24:57.187730 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:24:57 crc kubenswrapper[4753]: E0129 14:24:57.190126 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:24:57 crc kubenswrapper[4753]: E0129 14:24:57.190254 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd"
Jan 29 14:25:02 crc kubenswrapper[4753]: E0129 14:25:02.183806 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:25:02 crc kubenswrapper[4753]: E0129 14:25:02.185504 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:25:02 crc kubenswrapper[4753]: E0129 14:25:02.185917 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:25:02 crc kubenswrapper[4753]: E0129 14:25:02.186242 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Jan 29 14:25:02 crc kubenswrapper[4753]: E0129 14:25:02.186472 4753 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server"
Jan 29 14:25:02 crc kubenswrapper[4753]: E0129 14:25:02.187850 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:25:02 crc kubenswrapper[4753]: E0129 14:25:02.192930 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Jan 29 14:25:02 crc kubenswrapper[4753]: E0129 14:25:02.193048 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-98h7m" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395061 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c2lps"]
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.395784 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94157b6b-3cc9-44e9-9625-64d34611046a" containerName="mysql-bootstrap"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395802 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="94157b6b-3cc9-44e9-9625-64d34611046a" containerName="mysql-bootstrap"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.395829 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" containerName="openstack-network-exporter"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395844 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" containerName="openstack-network-exporter"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.395858 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395869 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.395885 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" containerName="setup-container"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395893 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" containerName="setup-container"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.395905 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="ceilometer-central-agent"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395913 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="ceilometer-central-agent"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.395922 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5c04aa-ed92-4c33-ad37-4420b362e237" containerName="rabbitmq"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395930 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5c04aa-ed92-4c33-ad37-4420b362e237" containerName="rabbitmq"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.395945 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" containerName="ovn-northd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395953 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" containerName="ovn-northd"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.395970 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6045aa-89c7-46c0-ba1e-4d63b9740883" containerName="keystone-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395978 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6045aa-89c7-46c0-ba1e-4d63b9740883" containerName="keystone-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.395989 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.395997 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396009 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="proxy-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396016 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="proxy-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396035 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396043 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396059 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6810e266-dec6-4731-884b-067f214781c2" containerName="glance-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396067 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6810e266-dec6-4731-884b-067f214781c2" containerName="glance-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396078 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396086 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396098 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-metadata"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396106 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-metadata"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396117 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94157b6b-3cc9-44e9-9625-64d34611046a" containerName="galera"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396125 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="94157b6b-3cc9-44e9-9625-64d34611046a" containerName="galera"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396136 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396144 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396159 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="sg-core"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396168 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="sg-core"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396180 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6810e266-dec6-4731-884b-067f214781c2" containerName="glance-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396224 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6810e266-dec6-4731-884b-067f214781c2" containerName="glance-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396263 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b88ba9-8449-4e76-a36c-34ca2b2be488" containerName="nova-scheduler-scheduler"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396272 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b88ba9-8449-4e76-a36c-34ca2b2be488" containerName="nova-scheduler-scheduler"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396285 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerName="neutron-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396293 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerName="neutron-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396302 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="ceilometer-notification-agent"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396309 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="ceilometer-notification-agent"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396319 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3184c640-0157-4211-aa5a-aada8557e9f8" containerName="glance-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396327 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3184c640-0157-4211-aa5a-aada8557e9f8" containerName="glance-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396338 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5c04aa-ed92-4c33-ad37-4420b362e237" containerName="setup-container"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396347 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5c04aa-ed92-4c33-ad37-4420b362e237" containerName="setup-container"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396361 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" containerName="rabbitmq"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396369 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" containerName="rabbitmq"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396382 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7a325b-7833-484f-8bba-7dc85ebf57cd" containerName="nova-cell0-conductor-conductor"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396390 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7a325b-7833-484f-8bba-7dc85ebf57cd" containerName="nova-cell0-conductor-conductor"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396406 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" containerName="ovn-controller"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396414 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" containerName="ovn-controller"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396429 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3184c640-0157-4211-aa5a-aada8557e9f8" containerName="glance-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396437 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3184c640-0157-4211-aa5a-aada8557e9f8" containerName="glance-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396445 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05315649-b501-4aae-9c14-4e632b89be53" containerName="kube-state-metrics"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396456 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="05315649-b501-4aae-9c14-4e632b89be53" containerName="kube-state-metrics"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396467 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" containerName="nova-cell1-conductor-conductor"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396475 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" containerName="nova-cell1-conductor-conductor"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396484 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerName="neutron-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396492 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerName="neutron-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.396503 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" containerName="memcached"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396511 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" containerName="memcached"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396677 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396692 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="94157b6b-3cc9-44e9-9625-64d34611046a" containerName="galera"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396704 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1df95b-0bf3-47ef-a25f-e5e8d7181ecb" containerName="memcached"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396713 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b88ba9-8449-4e76-a36c-34ca2b2be488" containerName="nova-scheduler-scheduler"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396726 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca7a69c-2f29-46d8-ab2a-67393114629f" containerName="ovn-controller"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396735 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerName="neutron-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396743 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6810e266-dec6-4731-884b-067f214781c2" containerName="glance-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396759 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="ceilometer-notification-agent"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396774 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" containerName="openstack-network-exporter"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396784 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="ceilometer-central-agent"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396798 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f7e3e27-a036-4623-8d63-557a3c0d76e6" containerName="rabbitmq"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396810 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-metadata"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396821 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="af44a6dc-a0bc-487f-b82a-05e0d08aa7ea" containerName="nova-cell1-conductor-conductor"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396833 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="31019cc8-ce90-453f-be4f-949ed45a5873" containerName="ovn-northd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396846 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1897b4f4-3f70-4584-9801-59c207f4d1db" containerName="neutron-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396860 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3184c640-0157-4211-aa5a-aada8557e9f8" containerName="glance-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396871 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396892 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7a325b-7833-484f-8bba-7dc85ebf57cd" containerName="nova-cell0-conductor-conductor"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396909 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="05315649-b501-4aae-9c14-4e632b89be53" containerName="kube-state-metrics"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396925 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5c04aa-ed92-4c33-ad37-4420b362e237" containerName="rabbitmq"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396942 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="sg-core"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396957 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ffa6e0-f5c8-4e29-84e2-a02a8061101d" containerName="proxy-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396971 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="920278e2-a31f-4ad2-81be-d30a799b9d64" containerName="barbican-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396983 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3184c640-0157-4211-aa5a-aada8557e9f8" containerName="glance-httpd"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.396994 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6810e266-dec6-4731-884b-067f214781c2" containerName="glance-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.397008 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7947b8f0-b134-40d9-beba-116bbb51a1c2" containerName="nova-metadata-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.397022 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6045aa-89c7-46c0-ba1e-4d63b9740883" containerName="keystone-api"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.397036 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="13672aee-1e34-4763-88d7-35ac9b484c87" containerName="nova-api-log"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.398459 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2lps"
Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.407557 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2lps"]
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.428760 4753 generic.go:334] "Generic (PLEG): container finished" podID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerID="5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef" exitCode=137 Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.428795 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86d005cc-e014-44b4-b8fa-f402d656ae5a","Type":"ContainerDied","Data":"5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef"} Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.428818 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86d005cc-e014-44b4-b8fa-f402d656ae5a","Type":"ContainerDied","Data":"610ee9ef9361d8e9f8a1747de3404adb6c4ac64de4f821f935f682958decbab7"} Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.428834 4753 scope.go:117] "RemoveContainer" containerID="b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.430425 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-929kf\" (UniqueName: \"kubernetes.io/projected/df8f04c8-671e-453e-83b8-5ced69b76402-kube-api-access-929kf\") pod \"redhat-operators-c2lps\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.430561 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-catalog-content\") pod \"redhat-operators-c2lps\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.430716 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-utilities\") pod \"redhat-operators-c2lps\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.452629 4753 scope.go:117] "RemoveContainer" containerID="5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.483757 4753 scope.go:117] "RemoveContainer" containerID="b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046" Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.484653 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046\": container with ID starting with b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046 not found: ID does not exist" containerID="b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.484697 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046"} err="failed to get container status \"b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046\": rpc error: code = NotFound desc = could not find container 
\"b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046\": container with ID starting with b31ea80637e5e5306fa22f080d7b06d4a830f6428739bc5f606e9204c3d34046 not found: ID does not exist" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.484724 4753 scope.go:117] "RemoveContainer" containerID="5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef" Jan 29 14:25:03 crc kubenswrapper[4753]: E0129 14:25:03.485092 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef\": container with ID starting with 5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef not found: ID does not exist" containerID="5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.485128 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef"} err="failed to get container status \"5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef\": rpc error: code = NotFound desc = could not find container \"5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef\": container with ID starting with 5285fead66a6c4524f89786c7fc35c9989322cecd4a7bad8b5d1238bb14770ef not found: ID does not exist" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.531899 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-combined-ca-bundle\") pod \"86d005cc-e014-44b4-b8fa-f402d656ae5a\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.531982 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-scripts\") pod \"86d005cc-e014-44b4-b8fa-f402d656ae5a\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.532021 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data-custom\") pod \"86d005cc-e014-44b4-b8fa-f402d656ae5a\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.532080 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86d005cc-e014-44b4-b8fa-f402d656ae5a-etc-machine-id\") pod \"86d005cc-e014-44b4-b8fa-f402d656ae5a\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.532102 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mlw8\" (UniqueName: \"kubernetes.io/projected/86d005cc-e014-44b4-b8fa-f402d656ae5a-kube-api-access-7mlw8\") pod \"86d005cc-e014-44b4-b8fa-f402d656ae5a\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") " Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.532189 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data\") pod \"86d005cc-e014-44b4-b8fa-f402d656ae5a\" (UID: \"86d005cc-e014-44b4-b8fa-f402d656ae5a\") 
" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.532376 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-929kf\" (UniqueName: \"kubernetes.io/projected/df8f04c8-671e-453e-83b8-5ced69b76402-kube-api-access-929kf\") pod \"redhat-operators-c2lps\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.532429 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-catalog-content\") pod \"redhat-operators-c2lps\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.532461 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-utilities\") pod \"redhat-operators-c2lps\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.533035 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-utilities\") pod \"redhat-operators-c2lps\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.534297 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86d005cc-e014-44b4-b8fa-f402d656ae5a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "86d005cc-e014-44b4-b8fa-f402d656ae5a" (UID: "86d005cc-e014-44b4-b8fa-f402d656ae5a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.534593 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-catalog-content\") pod \"redhat-operators-c2lps\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.552502 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-scripts" (OuterVolumeSpecName: "scripts") pod "86d005cc-e014-44b4-b8fa-f402d656ae5a" (UID: "86d005cc-e014-44b4-b8fa-f402d656ae5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.552517 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86d005cc-e014-44b4-b8fa-f402d656ae5a-kube-api-access-7mlw8" (OuterVolumeSpecName: "kube-api-access-7mlw8") pod "86d005cc-e014-44b4-b8fa-f402d656ae5a" (UID: "86d005cc-e014-44b4-b8fa-f402d656ae5a"). InnerVolumeSpecName "kube-api-access-7mlw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.552521 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "86d005cc-e014-44b4-b8fa-f402d656ae5a" (UID: "86d005cc-e014-44b4-b8fa-f402d656ae5a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.558616 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-929kf\" (UniqueName: \"kubernetes.io/projected/df8f04c8-671e-453e-83b8-5ced69b76402-kube-api-access-929kf\") pod \"redhat-operators-c2lps\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.598925 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86d005cc-e014-44b4-b8fa-f402d656ae5a" (UID: "86d005cc-e014-44b4-b8fa-f402d656ae5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.616006 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data" (OuterVolumeSpecName: "config-data") pod "86d005cc-e014-44b4-b8fa-f402d656ae5a" (UID: "86d005cc-e014-44b4-b8fa-f402d656ae5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.634353 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.634397 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.634411 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.634424 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86d005cc-e014-44b4-b8fa-f402d656ae5a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.634437 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mlw8\" (UniqueName: \"kubernetes.io/projected/86d005cc-e014-44b4-b8fa-f402d656ae5a-kube-api-access-7mlw8\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.634451 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d005cc-e014-44b4-b8fa-f402d656ae5a-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:03 crc kubenswrapper[4753]: I0129 14:25:03.743550 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:04 crc kubenswrapper[4753]: I0129 14:25:04.225942 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2lps"] Jan 29 14:25:04 crc kubenswrapper[4753]: I0129 14:25:04.458250 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2lps" event={"ID":"df8f04c8-671e-453e-83b8-5ced69b76402","Type":"ContainerStarted","Data":"67739f88084ac5b79aa18eda2babb5c98fa1c4315a0484afb6058571d3eeec15"} Jan 29 14:25:04 crc kubenswrapper[4753]: I0129 14:25:04.458473 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2lps" event={"ID":"df8f04c8-671e-453e-83b8-5ced69b76402","Type":"ContainerStarted","Data":"f01a6fd7e4fb2ff0ac51c1ef94d18a0517a2d5ae1c819658e0236f0f941e06bb"} Jan 29 14:25:04 crc kubenswrapper[4753]: I0129 14:25:04.463782 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 14:25:04 crc kubenswrapper[4753]: I0129 14:25:04.547927 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 14:25:04 crc kubenswrapper[4753]: I0129 14:25:04.552120 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.471521 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-98h7m_a17eeeff-955e-4718-9e0e-15fae4b8d9db/ovs-vswitchd/0.log" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.472729 4753 generic.go:334] "Generic (PLEG): container finished" podID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" exitCode=137 Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.472783 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-98h7m" event={"ID":"a17eeeff-955e-4718-9e0e-15fae4b8d9db","Type":"ContainerDied","Data":"49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0"} Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.473971 4753 generic.go:334] "Generic (PLEG): container finished" podID="df8f04c8-671e-453e-83b8-5ced69b76402" containerID="67739f88084ac5b79aa18eda2babb5c98fa1c4315a0484afb6058571d3eeec15" exitCode=0 Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.474010 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2lps" event={"ID":"df8f04c8-671e-453e-83b8-5ced69b76402","Type":"ContainerDied","Data":"67739f88084ac5b79aa18eda2babb5c98fa1c4315a0484afb6058571d3eeec15"} Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.476209 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.489207 4753 generic.go:334] "Generic (PLEG): container finished" podID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerID="3c1a5183ff790d39a91d088d0d1476e957753555d63f7eb99e9f9efc1ecd852f" exitCode=137 Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.489245 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"3c1a5183ff790d39a91d088d0d1476e957753555d63f7eb99e9f9efc1ecd852f"} Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.582254 4753 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-ovs-98h7m_a17eeeff-955e-4718-9e0e-15fae4b8d9db/ovs-vswitchd/0.log" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.582837 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669087 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-etc-ovs\") pod \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669142 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a17eeeff-955e-4718-9e0e-15fae4b8d9db-scripts\") pod \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669199 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmrcd\" (UniqueName: \"kubernetes.io/projected/a17eeeff-955e-4718-9e0e-15fae4b8d9db-kube-api-access-fmrcd\") pod \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669216 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-run\") pod \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669257 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-log\") pod \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669278 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-lib\") pod \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\" (UID: \"a17eeeff-955e-4718-9e0e-15fae4b8d9db\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669273 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "a17eeeff-955e-4718-9e0e-15fae4b8d9db" (UID: "a17eeeff-955e-4718-9e0e-15fae4b8d9db"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669360 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-log" (OuterVolumeSpecName: "var-log") pod "a17eeeff-955e-4718-9e0e-15fae4b8d9db" (UID: "a17eeeff-955e-4718-9e0e-15fae4b8d9db"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669361 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-run" (OuterVolumeSpecName: "var-run") pod "a17eeeff-955e-4718-9e0e-15fae4b8d9db" (UID: "a17eeeff-955e-4718-9e0e-15fae4b8d9db"). 
InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669407 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-lib" (OuterVolumeSpecName: "var-lib") pod "a17eeeff-955e-4718-9e0e-15fae4b8d9db" (UID: "a17eeeff-955e-4718-9e0e-15fae4b8d9db"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669661 4753 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-lib\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669680 4753 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-etc-ovs\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669689 4753 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-run\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.669698 4753 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a17eeeff-955e-4718-9e0e-15fae4b8d9db-var-log\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.670509 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a17eeeff-955e-4718-9e0e-15fae4b8d9db-scripts" (OuterVolumeSpecName: "scripts") pod "a17eeeff-955e-4718-9e0e-15fae4b8d9db" (UID: "a17eeeff-955e-4718-9e0e-15fae4b8d9db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.674518 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a17eeeff-955e-4718-9e0e-15fae4b8d9db-kube-api-access-fmrcd" (OuterVolumeSpecName: "kube-api-access-fmrcd") pod "a17eeeff-955e-4718-9e0e-15fae4b8d9db" (UID: "a17eeeff-955e-4718-9e0e-15fae4b8d9db"). InnerVolumeSpecName "kube-api-access-fmrcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.688752 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.770693 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-lock\") pod \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.770758 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-cache\") pod \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.770784 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vktf4\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-kube-api-access-vktf4\") pod \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.770814 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6d9169-050f-40e0-91ff-80d0afa6ff53-combined-ca-bundle\") pod \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.770836 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") pod \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.770868 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\" (UID: \"ac6d9169-050f-40e0-91ff-80d0afa6ff53\") " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.771046 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a17eeeff-955e-4718-9e0e-15fae4b8d9db-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.771057 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmrcd\" (UniqueName: \"kubernetes.io/projected/a17eeeff-955e-4718-9e0e-15fae4b8d9db-kube-api-access-fmrcd\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.771672 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-lock" (OuterVolumeSpecName: "lock") pod "ac6d9169-050f-40e0-91ff-80d0afa6ff53" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.772120 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-cache" (OuterVolumeSpecName: "cache") pod "ac6d9169-050f-40e0-91ff-80d0afa6ff53" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.775050 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "ac6d9169-050f-40e0-91ff-80d0afa6ff53" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.775073 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-kube-api-access-vktf4" (OuterVolumeSpecName: "kube-api-access-vktf4") pod "ac6d9169-050f-40e0-91ff-80d0afa6ff53" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53"). InnerVolumeSpecName "kube-api-access-vktf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.775399 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ac6d9169-050f-40e0-91ff-80d0afa6ff53" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.872075 4753 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-lock\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.872112 4753 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ac6d9169-050f-40e0-91ff-80d0afa6ff53-cache\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.872127 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vktf4\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-kube-api-access-vktf4\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.872142 4753 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ac6d9169-050f-40e0-91ff-80d0afa6ff53-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.872196 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.888308 4753 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 29 14:25:05 crc kubenswrapper[4753]: I0129 14:25:05.974293 4753 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.090486 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6d9169-050f-40e0-91ff-80d0afa6ff53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac6d9169-050f-40e0-91ff-80d0afa6ff53" (UID: "ac6d9169-050f-40e0-91ff-80d0afa6ff53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.166409 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86d005cc-e014-44b4-b8fa-f402d656ae5a" path="/var/lib/kubelet/pods/86d005cc-e014-44b4-b8fa-f402d656ae5a/volumes" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.177885 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6d9169-050f-40e0-91ff-80d0afa6ff53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.503659 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-98h7m_a17eeeff-955e-4718-9e0e-15fae4b8d9db/ovs-vswitchd/0.log" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.505193 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-98h7m" event={"ID":"a17eeeff-955e-4718-9e0e-15fae4b8d9db","Type":"ContainerDied","Data":"92d9bdc0d276e91faa557fcc71325bda4d2fef360a9a067f152c8787ad06822e"} Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.505258 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-98h7m" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.505303 4753 scope.go:117] "RemoveContainer" containerID="49542e6482144dada4b536028bddb50efa5bf964b6b655684ae9572d215b58f0" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.518629 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ac6d9169-050f-40e0-91ff-80d0afa6ff53","Type":"ContainerDied","Data":"85ed28ab259952d0e0fdba457c93a2bac0461baa62a68e52c5d1b868679b1e3a"} Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.518723 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.537358 4753 scope.go:117] "RemoveContainer" containerID="f8f864ca36d71ff49e670e283d5f1465650229e0a336b7eb1746e958347e6742" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.543345 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-98h7m"] Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.550220 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-98h7m"] Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.565723 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.575580 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.578419 4753 scope.go:117] "RemoveContainer" containerID="554ab450b98c8af45b7d97da5f184a9f2c588b955ad8fc71c36ce3fca25f5e71" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.620311 4753 scope.go:117] "RemoveContainer" containerID="3c1a5183ff790d39a91d088d0d1476e957753555d63f7eb99e9f9efc1ecd852f" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.643337 4753 scope.go:117] "RemoveContainer" containerID="f2a0991e4200eaba753ab9243604efaa3906af593fb0112ee1a38be84f321412" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.662297 4753 scope.go:117] "RemoveContainer" containerID="4e7b00175d4cce7fcb0347c3d927ada241593a834021b289319e8fb80e3be8d0" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.678618 4753 scope.go:117] "RemoveContainer" containerID="9e38b526a388417cc886c1224a1c67b02aaed0db6c47390206d2232611f965db" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.713711 4753 scope.go:117] "RemoveContainer" containerID="5ea90ff3252e8f33e1b49dcc054b155d3ca71041184f597d0ebf57cf05baa4d3" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.734737 4753 scope.go:117] "RemoveContainer" containerID="3cebc2a254087212c9cd362b275183545752ff8879dd6da9c51d87070dfa4dab" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.757732 4753 scope.go:117] "RemoveContainer" containerID="f7948c4c2ea98b702b132274cc49f43a7f9f174f2d4a40192b402344998935a2" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.793487 4753 scope.go:117] "RemoveContainer" containerID="2f0532da04eff45cc1a9f36829b01187bceb564dbe88cdc93585471ff9447783" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.813535 4753 scope.go:117] "RemoveContainer" containerID="9cc09e193c62f0685bbeb053af7bcee09164f6f7a1f094768044b967541fdd99" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.839965 4753 scope.go:117] "RemoveContainer" containerID="e4b8c37d71f04f91b5041bbfd2ebe5398fdb2a0eabb2c788da216bac1701192b" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.857688 4753 scope.go:117] "RemoveContainer" containerID="902624d332ba6e97b89215d66460f6d1894207ecaac9579489c0f78d378469ed" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.884054 4753 scope.go:117] "RemoveContainer" containerID="851cf5e8ab4d46230b00d65ff6d2fe461116124fb2be946904a0939073438463" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.918954 4753 scope.go:117] "RemoveContainer" containerID="e165e57a40f6bc0c996660f2d010eaf2048fe443cee81d3a5bb35a9b618cefea" Jan 29 14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.942094 4753 scope.go:117] "RemoveContainer" containerID="8c14da07d1d3d581ed1cdb3f263473ebec5da8f60db91aaf8dae95021a6acbfa" Jan 29 
14:25:06 crc kubenswrapper[4753]: I0129 14:25:06.968998 4753 scope.go:117] "RemoveContainer" containerID="894c79912ca8aa5fcde9530fa2c8b0075ce313eea0c4e93fccea72f8d718e504" Jan 29 14:25:08 crc kubenswrapper[4753]: I0129 14:25:08.175979 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" path="/var/lib/kubelet/pods/a17eeeff-955e-4718-9e0e-15fae4b8d9db/volumes" Jan 29 14:25:08 crc kubenswrapper[4753]: I0129 14:25:08.176906 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" path="/var/lib/kubelet/pods/ac6d9169-050f-40e0-91ff-80d0afa6ff53/volumes" Jan 29 14:25:08 crc kubenswrapper[4753]: I0129 14:25:08.543577 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2lps" event={"ID":"df8f04c8-671e-453e-83b8-5ced69b76402","Type":"ContainerStarted","Data":"ee659220c5a97c6757aaa93bc87dc40cc3e931a6647a61d65e9b6af24ceca0f8"} Jan 29 14:25:09 crc kubenswrapper[4753]: I0129 14:25:09.555943 4753 generic.go:334] "Generic (PLEG): container finished" podID="df8f04c8-671e-453e-83b8-5ced69b76402" containerID="ee659220c5a97c6757aaa93bc87dc40cc3e931a6647a61d65e9b6af24ceca0f8" exitCode=0 Jan 29 14:25:09 crc kubenswrapper[4753]: I0129 14:25:09.556006 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2lps" event={"ID":"df8f04c8-671e-453e-83b8-5ced69b76402","Type":"ContainerDied","Data":"ee659220c5a97c6757aaa93bc87dc40cc3e931a6647a61d65e9b6af24ceca0f8"} Jan 29 14:25:12 crc kubenswrapper[4753]: I0129 14:25:12.583096 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2lps" event={"ID":"df8f04c8-671e-453e-83b8-5ced69b76402","Type":"ContainerStarted","Data":"2b8d6f5f823d79c5552be1a33a063d9917a87efebc6b117bc7cf6449fe76d059"} Jan 29 14:25:12 crc kubenswrapper[4753]: I0129 14:25:12.625123 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c2lps" podStartSLOduration=3.474982861 podStartE2EDuration="9.625096484s" podCreationTimestamp="2026-01-29 14:25:03 +0000 UTC" firstStartedPulling="2026-01-29 14:25:05.475998301 +0000 UTC m=+1340.170732683" lastFinishedPulling="2026-01-29 14:25:11.626111924 +0000 UTC m=+1346.320846306" observedRunningTime="2026-01-29 14:25:12.617099872 +0000 UTC m=+1347.311834344" watchObservedRunningTime="2026-01-29 14:25:12.625096484 +0000 UTC m=+1347.319830906" Jan 29 14:25:13 crc kubenswrapper[4753]: I0129 14:25:13.744515 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:13 crc kubenswrapper[4753]: I0129 14:25:13.744559 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:14 crc kubenswrapper[4753]: I0129 14:25:14.818944 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c2lps" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" containerName="registry-server" probeResult="failure" output=< Jan 29 14:25:14 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 14:25:14 crc kubenswrapper[4753]: > Jan 29 14:25:23 crc kubenswrapper[4753]: I0129 14:25:23.813421 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:23 crc kubenswrapper[4753]: I0129 
14:25:23.880227 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:24 crc kubenswrapper[4753]: I0129 14:25:24.068258 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2lps"] Jan 29 14:25:25 crc kubenswrapper[4753]: I0129 14:25:25.704108 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c2lps" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" containerName="registry-server" containerID="cri-o://2b8d6f5f823d79c5552be1a33a063d9917a87efebc6b117bc7cf6449fe76d059" gracePeriod=2 Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.713924 4753 generic.go:334] "Generic (PLEG): container finished" podID="df8f04c8-671e-453e-83b8-5ced69b76402" containerID="2b8d6f5f823d79c5552be1a33a063d9917a87efebc6b117bc7cf6449fe76d059" exitCode=0 Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.714022 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2lps" event={"ID":"df8f04c8-671e-453e-83b8-5ced69b76402","Type":"ContainerDied","Data":"2b8d6f5f823d79c5552be1a33a063d9917a87efebc6b117bc7cf6449fe76d059"} Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.714416 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2lps" event={"ID":"df8f04c8-671e-453e-83b8-5ced69b76402","Type":"ContainerDied","Data":"f01a6fd7e4fb2ff0ac51c1ef94d18a0517a2d5ae1c819658e0236f0f941e06bb"} Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.714438 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f01a6fd7e4fb2ff0ac51c1ef94d18a0517a2d5ae1c819658e0236f0f941e06bb" Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.744213 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.819135 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-929kf\" (UniqueName: \"kubernetes.io/projected/df8f04c8-671e-453e-83b8-5ced69b76402-kube-api-access-929kf\") pod \"df8f04c8-671e-453e-83b8-5ced69b76402\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.819231 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-utilities\") pod \"df8f04c8-671e-453e-83b8-5ced69b76402\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.819375 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-catalog-content\") pod \"df8f04c8-671e-453e-83b8-5ced69b76402\" (UID: \"df8f04c8-671e-453e-83b8-5ced69b76402\") " Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.824234 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-utilities" (OuterVolumeSpecName: "utilities") pod "df8f04c8-671e-453e-83b8-5ced69b76402" (UID: "df8f04c8-671e-453e-83b8-5ced69b76402"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.828417 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df8f04c8-671e-453e-83b8-5ced69b76402-kube-api-access-929kf" (OuterVolumeSpecName: "kube-api-access-929kf") pod "df8f04c8-671e-453e-83b8-5ced69b76402" (UID: "df8f04c8-671e-453e-83b8-5ced69b76402"). InnerVolumeSpecName "kube-api-access-929kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.920742 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-929kf\" (UniqueName: \"kubernetes.io/projected/df8f04c8-671e-453e-83b8-5ced69b76402-kube-api-access-929kf\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.920768 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:26 crc kubenswrapper[4753]: I0129 14:25:26.970702 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df8f04c8-671e-453e-83b8-5ced69b76402" (UID: "df8f04c8-671e-453e-83b8-5ced69b76402"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.021752 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8f04c8-671e-453e-83b8-5ced69b76402-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.054387 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.054454 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.054506 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.055191 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad5415f8f2b12f61c8e4717f3b18699a52b4bcca8d38b639a61f5684d21e9c46"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.055263 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://ad5415f8f2b12f61c8e4717f3b18699a52b4bcca8d38b639a61f5684d21e9c46" gracePeriod=600 
Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.723447 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="ad5415f8f2b12f61c8e4717f3b18699a52b4bcca8d38b639a61f5684d21e9c46" exitCode=0 Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.723528 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"ad5415f8f2b12f61c8e4717f3b18699a52b4bcca8d38b639a61f5684d21e9c46"} Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.723816 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"} Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.723835 4753 scope.go:117] "RemoveContainer" containerID="12d7924f9ff7f63db0598221481b584d9481ba358c87450c2b5683ad81272c03" Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.723838 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2lps" Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.796377 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2lps"] Jan 29 14:25:27 crc kubenswrapper[4753]: I0129 14:25:27.802034 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c2lps"] Jan 29 14:25:28 crc kubenswrapper[4753]: I0129 14:25:28.164792 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" path="/var/lib/kubelet/pods/df8f04c8-671e-453e-83b8-5ced69b76402/volumes" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.091862 4753 scope.go:117] "RemoveContainer" containerID="576381ff3eac0a6ba00cc42f35866b9e306929cc77dfe313fbee2eabe20eabfd" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.124684 4753 scope.go:117] "RemoveContainer" containerID="04cf2bb41576bf58389254d683314aee72861c9ebd5be98e2ebdeec419c78c85" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.187690 4753 scope.go:117] "RemoveContainer" containerID="77bf30ffd3f8171c6519acda2167defb718243931cd2d3e74a471e92923e38b6" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.230740 4753 scope.go:117] "RemoveContainer" containerID="b74d0aaf87d9391cd8357f4b54f86596f5fc1f32b80ddcb4a7b5f92ee6f68faf" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.276973 4753 scope.go:117] "RemoveContainer" containerID="e9817aee647b3f9d81adf2e4ce14d5bae32c01236812cacdf0002005b7c75760" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.324307 4753 scope.go:117] "RemoveContainer" containerID="52d311fc58397250b09be34c36b15dfcfcdccde7d8b80b2368229261729b241a" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.355780 4753 scope.go:117] "RemoveContainer" containerID="ebad3f69ada789b8ad418576e49d5088e4c02b2732eef46ca9ae5b4cc644d332" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.410221 4753 scope.go:117] "RemoveContainer" containerID="197dc1a71d086547b1bd896b6d0217f47e70ba62b316ca866c8afcd65fa24313" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.451246 4753 scope.go:117] "RemoveContainer" containerID="600dea35b4935654339a65c2998a749f0c3146c44c8980223cca2ab749d53237" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 
14:27:01.480503 4753 scope.go:117] "RemoveContainer" containerID="ebd26770e4f3deee8e6cab6959319039b301c068d712e07635f86a428ddaecb3"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.504044 4753 scope.go:117] "RemoveContainer" containerID="7d9bc6a338bfd8692c4f35a236d7b9e2ef5f2c475ad33ced83ef75edf0c8103b"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.526550 4753 scope.go:117] "RemoveContainer" containerID="bd11cb05b1130591c9e8b25904125b7acdc099679066354cd495cac1fb5d906c"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.574549 4753 scope.go:117] "RemoveContainer" containerID="838908d5e19934d82b12ae2e84d73201f81cdc2b5ee863afff8ead4c7b4b6cbf"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.622443 4753 scope.go:117] "RemoveContainer" containerID="a09633d250fe3b97729c4d076f39aed0c8cbd6633607e39dc28ae59a436fa5e4"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.639062 4753 scope.go:117] "RemoveContainer" containerID="273ad29011f17b33fc1b0ca8c43fc417a7a9fcd227644ac6dea1d27d6b03361b"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.653745 4753 scope.go:117] "RemoveContainer" containerID="cdc4d17706fe091cbd13f2095b63490a23b139430dc546836664bce0499b67f6"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.672416 4753 scope.go:117] "RemoveContainer" containerID="9de8348dae5eee512c3a686a7ee1add3f75b5b58d76f466710a04d0d910d5cd7"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.706776 4753 scope.go:117] "RemoveContainer" containerID="a9f3c8865dd90ae8cd702248995b59416d6e48fd3e228641478bf40b18db9234"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.743971 4753 scope.go:117] "RemoveContainer" containerID="0f4a8a7dd4d3d6d4c66441e081f1c7bde2dc88522a62f0e1514743d865dd861a"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.776577 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlqq"]
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.776928 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.776949 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.776965 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-auditor"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.776973 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-auditor"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.776983 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.776990 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.776999 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777007 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777020 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-replicator"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777029 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-replicator"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777041 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server-init"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777048 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server-init"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777060 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777068 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777080 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-replicator"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777088 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-replicator"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777099 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-replicator"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777106 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-replicator"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777116 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerName="probe"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777123 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerName="probe"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777132 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-reaper"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777140 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-reaper"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777165 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-auditor"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777173 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-auditor"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777184 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" containerName="extract-content"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777192 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" containerName="extract-content"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777206 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" containerName="registry-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777214 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" containerName="registry-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777226 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerName="cinder-scheduler"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777235 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerName="cinder-scheduler"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777250 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-expirer"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777259 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-expirer"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777270 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="rsync"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777278 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="rsync"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777293 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777301 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777312 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-updater"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777320 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-updater"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777332 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" containerName="extract-utilities"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777339 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" containerName="extract-utilities"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777348 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-updater"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777356 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-updater"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777369 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-auditor"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777377 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-auditor"
Jan 29 14:27:01 crc kubenswrapper[4753]: E0129 14:27:01.777393 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="swift-recon-cron"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777400 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="swift-recon-cron"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777551 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-replicator"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777568 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-updater"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777578 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-reaper"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777585 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="rsync"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777595 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="df8f04c8-671e-453e-83b8-5ced69b76402" containerName="registry-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777605 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-replicator"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777617 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777629 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-server"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777641 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="swift-recon-cron"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777653 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="account-auditor"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777661 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-replicator"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777673 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-auditor"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777681 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-updater"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777690 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerName="cinder-scheduler"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777699 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovs-vswitchd"
Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777709 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-auditor"
podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-auditor" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777717 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="object-expirer" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777728 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a17eeeff-955e-4718-9e0e-15fae4b8d9db" containerName="ovsdb-server" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777740 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="86d005cc-e014-44b4-b8fa-f402d656ae5a" containerName="probe" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.777752 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6d9169-050f-40e0-91ff-80d0afa6ff53" containerName="container-server" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.779204 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.786088 4753 scope.go:117] "RemoveContainer" containerID="e00e67a39ecbbcca5a026b499f041483245ed4bf836f275bfea36c7f48164b65" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.790687 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlqq"] Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.830109 4753 scope.go:117] "RemoveContainer" containerID="14e5df1a722e4fdd9f1cdff60bc64e62c47ca34041f0ff038441520109891f30" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.872990 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-utilities\") pod \"redhat-marketplace-vdlqq\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.873070 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqdvx\" (UniqueName: \"kubernetes.io/projected/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-kube-api-access-xqdvx\") pod \"redhat-marketplace-vdlqq\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.873173 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-catalog-content\") pod \"redhat-marketplace-vdlqq\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.877595 4753 scope.go:117] "RemoveContainer" containerID="1a0cd98dd6eddf05f0e42f6c2fc60a459aac1e62568241e836fba1754c6c671e" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.974116 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-catalog-content\") pod \"redhat-marketplace-vdlqq\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.974191 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-utilities\") pod \"redhat-marketplace-vdlqq\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.974237 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqdvx\" (UniqueName: \"kubernetes.io/projected/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-kube-api-access-xqdvx\") pod \"redhat-marketplace-vdlqq\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.974851 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-catalog-content\") pod \"redhat-marketplace-vdlqq\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.975099 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-utilities\") pod \"redhat-marketplace-vdlqq\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:01 crc kubenswrapper[4753]: I0129 14:27:01.997758 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqdvx\" (UniqueName: \"kubernetes.io/projected/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-kube-api-access-xqdvx\") pod \"redhat-marketplace-vdlqq\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:02 crc kubenswrapper[4753]: I0129 14:27:02.106469 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:02 crc kubenswrapper[4753]: I0129 14:27:02.631383 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlqq"] Jan 29 14:27:02 crc kubenswrapper[4753]: I0129 14:27:02.836720 4753 generic.go:334] "Generic (PLEG): container finished" podID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerID="1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1" exitCode=0 Jan 29 14:27:02 crc kubenswrapper[4753]: I0129 14:27:02.837015 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlqq" event={"ID":"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d","Type":"ContainerDied","Data":"1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1"} Jan 29 14:27:02 crc kubenswrapper[4753]: I0129 14:27:02.837048 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlqq" event={"ID":"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d","Type":"ContainerStarted","Data":"32798055c94f3367fe60cbd537129a9b28e34c675a335ef513599802e202e838"} Jan 29 14:27:04 crc kubenswrapper[4753]: I0129 14:27:04.860645 4753 generic.go:334] "Generic (PLEG): container finished" podID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerID="49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b" exitCode=0 Jan 29 14:27:04 crc kubenswrapper[4753]: I0129 14:27:04.860703 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlqq" event={"ID":"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d","Type":"ContainerDied","Data":"49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b"} Jan 29 14:27:05 crc kubenswrapper[4753]: I0129 14:27:05.874000 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlqq" event={"ID":"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d","Type":"ContainerStarted","Data":"641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a"} Jan 29 14:27:05 crc kubenswrapper[4753]: I0129 14:27:05.904298 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vdlqq" podStartSLOduration=2.450226374 podStartE2EDuration="4.904269968s" podCreationTimestamp="2026-01-29 14:27:01 +0000 UTC" firstStartedPulling="2026-01-29 14:27:02.840476046 +0000 UTC m=+1457.535210428" lastFinishedPulling="2026-01-29 14:27:05.29451964 +0000 UTC m=+1459.989254022" observedRunningTime="2026-01-29 14:27:05.895540415 +0000 UTC m=+1460.590274807" watchObservedRunningTime="2026-01-29 14:27:05.904269968 +0000 UTC m=+1460.599004380" Jan 29 14:27:12 crc kubenswrapper[4753]: I0129 14:27:12.108267 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:12 crc kubenswrapper[4753]: I0129 14:27:12.109394 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:12 crc kubenswrapper[4753]: I0129 14:27:12.161950 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:13 crc kubenswrapper[4753]: I0129 14:27:13.018763 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:13 crc kubenswrapper[4753]: I0129 14:27:13.083222 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-vdlqq"] Jan 29 14:27:14 crc kubenswrapper[4753]: I0129 14:27:14.942916 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vdlqq" podUID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerName="registry-server" containerID="cri-o://641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a" gracePeriod=2 Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.398576 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.579261 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqdvx\" (UniqueName: \"kubernetes.io/projected/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-kube-api-access-xqdvx\") pod \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.579370 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-utilities\") pod \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.579420 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-catalog-content\") pod \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\" (UID: \"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d\") " Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.580120 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-utilities" (OuterVolumeSpecName: "utilities") pod "a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" (UID: "a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.587172 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-kube-api-access-xqdvx" (OuterVolumeSpecName: "kube-api-access-xqdvx") pod "a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" (UID: "a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d"). InnerVolumeSpecName "kube-api-access-xqdvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.621965 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" (UID: "a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.681614 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.681658 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.681731 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqdvx\" (UniqueName: \"kubernetes.io/projected/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d-kube-api-access-xqdvx\") on node \"crc\" DevicePath \"\"" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.951201 4753 generic.go:334] "Generic (PLEG): container finished" podID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerID="641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a" exitCode=0 Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.951242 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlqq" event={"ID":"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d","Type":"ContainerDied","Data":"641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a"} Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.951268 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdlqq" event={"ID":"a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d","Type":"ContainerDied","Data":"32798055c94f3367fe60cbd537129a9b28e34c675a335ef513599802e202e838"} Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.951285 4753 scope.go:117] "RemoveContainer" containerID="641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.951282 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdlqq" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.971112 4753 scope.go:117] "RemoveContainer" containerID="49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.986181 4753 scope.go:117] "RemoveContainer" containerID="1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1" Jan 29 14:27:15 crc kubenswrapper[4753]: I0129 14:27:15.997521 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlqq"] Jan 29 14:27:16 crc kubenswrapper[4753]: I0129 14:27:16.002991 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdlqq"] Jan 29 14:27:16 crc kubenswrapper[4753]: I0129 14:27:16.012058 4753 scope.go:117] "RemoveContainer" containerID="641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a" Jan 29 14:27:16 crc kubenswrapper[4753]: E0129 14:27:16.012569 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a\": container with ID starting with 641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a not found: ID does not exist" containerID="641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a" Jan 29 14:27:16 crc kubenswrapper[4753]: I0129 14:27:16.012607 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a"} err="failed to get container status \"641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a\": rpc error: code = NotFound desc = could not find container \"641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a\": container with ID starting with 641c0b57d1cf786eb38bf53c5c13c53c22ea463219ac62213b976516e2e4509a not found: ID does not exist" Jan 29 14:27:16 crc kubenswrapper[4753]: I0129 14:27:16.012630 4753 scope.go:117] "RemoveContainer" containerID="49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b" Jan 29 14:27:16 crc kubenswrapper[4753]: E0129 14:27:16.013024 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b\": container with ID starting with 49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b not found: ID does not exist" containerID="49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b" Jan 29 14:27:16 crc kubenswrapper[4753]: I0129 14:27:16.013070 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b"} err="failed to get container status \"49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b\": rpc error: code = NotFound desc = could not find container \"49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b\": container with ID starting with 49d0d361159ad79d132ac28bd11cbbaeadb0df6c3027244bdb91b558131df54b not found: ID does not exist" Jan 29 14:27:16 crc kubenswrapper[4753]: I0129 14:27:16.013098 4753 scope.go:117] "RemoveContainer" containerID="1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1" Jan 29 14:27:16 crc kubenswrapper[4753]: E0129 14:27:16.013420 4753 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1\": container with ID starting with 1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1 not found: ID does not exist" containerID="1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1" Jan 29 14:27:16 crc kubenswrapper[4753]: I0129 14:27:16.013450 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1"} err="failed to get container status \"1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1\": rpc error: code = NotFound desc = could not find container \"1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1\": container with ID starting with 1d3a7fd40e51acc0657b5aed6311af4c87d5e168081370a7cd1bc365f55498a1 not found: ID does not exist" Jan 29 14:27:16 crc kubenswrapper[4753]: I0129 14:27:16.160364 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" path="/var/lib/kubelet/pods/a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d/volumes" Jan 29 14:27:27 crc kubenswrapper[4753]: I0129 14:27:27.056894 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:27:27 crc kubenswrapper[4753]: I0129 14:27:27.057386 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:27:57 crc kubenswrapper[4753]: I0129 14:27:57.054970 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:27:57 crc kubenswrapper[4753]: I0129 14:27:57.055861 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:28:02 crc kubenswrapper[4753]: I0129 14:28:02.340957 4753 scope.go:117] "RemoveContainer" containerID="937a66892ca282be79cde33ce6bfc835a53f9fcd82a044866e67c40c8313e482" Jan 29 14:28:02 crc kubenswrapper[4753]: I0129 14:28:02.405866 4753 scope.go:117] "RemoveContainer" containerID="8c50dc9a01d5ccf0266d0160ffff1fd0b192424fdbde1d80fdc73417a471c529" Jan 29 14:28:02 crc kubenswrapper[4753]: I0129 14:28:02.433450 4753 scope.go:117] "RemoveContainer" containerID="b77d8c3d96dc50269ebb9916110f03d8650bc3d367afa27d97606e1aa2292caa" Jan 29 14:28:02 crc kubenswrapper[4753]: I0129 14:28:02.462578 4753 scope.go:117] "RemoveContainer" containerID="09fccc35a93bd7153a392185cb41cd50ef322765121cbd848e3a3e60eb0e62c6" Jan 29 14:28:02 crc kubenswrapper[4753]: I0129 14:28:02.494827 4753 scope.go:117] "RemoveContainer" 
containerID="4e6451e61042e484600018939d4da18de458042e275e4e20bffaa3d9d6779d31" Jan 29 14:28:02 crc kubenswrapper[4753]: I0129 14:28:02.523272 4753 scope.go:117] "RemoveContainer" containerID="0619b103b3467bd1fe70b0902054e340ccfb8ab97a1116e60aa44770c60580dc" Jan 29 14:28:02 crc kubenswrapper[4753]: I0129 14:28:02.546878 4753 scope.go:117] "RemoveContainer" containerID="90e322952691e5340d7dfdb0eef5d70192f25afb3fa06e2774112e840dc0d238" Jan 29 14:28:02 crc kubenswrapper[4753]: I0129 14:28:02.598465 4753 scope.go:117] "RemoveContainer" containerID="e450be8a7013dbd08831b7150e1c8ccd73e085bf16b828b286d89403e5f7cfbc" Jan 29 14:28:02 crc kubenswrapper[4753]: I0129 14:28:02.640327 4753 scope.go:117] "RemoveContainer" containerID="3b46cefbd0f083420a3feabe6fcc97942c18ff0babb3df072dd24c5df0a8f895" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.033832 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vshxl"] Jan 29 14:28:27 crc kubenswrapper[4753]: E0129 14:28:27.037283 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerName="extract-utilities" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.037308 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerName="extract-utilities" Jan 29 14:28:27 crc kubenswrapper[4753]: E0129 14:28:27.037342 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerName="registry-server" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.037353 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerName="registry-server" Jan 29 14:28:27 crc kubenswrapper[4753]: E0129 14:28:27.037369 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerName="extract-content" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.037378 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerName="extract-content" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.037857 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a96c0175-c66b-46ff-a9f1-8b3fc8e9eb8d" containerName="registry-server" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.039434 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.053042 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vshxl"] Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.055835 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.055886 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.055935 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.056613 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.056682 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" gracePeriod=600 Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.141909 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9jn\" (UniqueName: \"kubernetes.io/projected/3c3c4d3e-7053-4969-aea2-2995eb981024-kube-api-access-nm9jn\") pod \"community-operators-vshxl\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.141980 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-catalog-content\") pod \"community-operators-vshxl\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.142087 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-utilities\") pod \"community-operators-vshxl\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.243115 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9jn\" (UniqueName: \"kubernetes.io/projected/3c3c4d3e-7053-4969-aea2-2995eb981024-kube-api-access-nm9jn\") 
pod \"community-operators-vshxl\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.243233 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-catalog-content\") pod \"community-operators-vshxl\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.243306 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-utilities\") pod \"community-operators-vshxl\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.244030 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-catalog-content\") pod \"community-operators-vshxl\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.244086 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-utilities\") pod \"community-operators-vshxl\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.270974 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm9jn\" (UniqueName: \"kubernetes.io/projected/3c3c4d3e-7053-4969-aea2-2995eb981024-kube-api-access-nm9jn\") pod \"community-operators-vshxl\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.373176 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.644644 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vshxl"] Jan 29 14:28:27 crc kubenswrapper[4753]: E0129 14:28:27.772228 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.811371 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vshxl" event={"ID":"3c3c4d3e-7053-4969-aea2-2995eb981024","Type":"ContainerStarted","Data":"b3b28543124a7392a39f6a37094ee336845e4cc5d9aa3b3ecf943e4ecf435514"} Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.813721 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" exitCode=0 Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.813749 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"} Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.813771 4753 scope.go:117] "RemoveContainer" containerID="ad5415f8f2b12f61c8e4717f3b18699a52b4bcca8d38b639a61f5684d21e9c46" Jan 29 14:28:27 crc kubenswrapper[4753]: I0129 14:28:27.814403 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:28:27 crc kubenswrapper[4753]: E0129 14:28:27.814687 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:28:28 crc kubenswrapper[4753]: I0129 14:28:28.828413 4753 generic.go:334] "Generic (PLEG): container finished" podID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerID="8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c" exitCode=0 Jan 29 14:28:28 crc kubenswrapper[4753]: I0129 14:28:28.828491 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vshxl" event={"ID":"3c3c4d3e-7053-4969-aea2-2995eb981024","Type":"ContainerDied","Data":"8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c"} Jan 29 14:28:30 crc kubenswrapper[4753]: I0129 14:28:30.852370 4753 generic.go:334] "Generic (PLEG): container finished" podID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerID="9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359" exitCode=0 Jan 29 14:28:30 crc kubenswrapper[4753]: I0129 14:28:30.852664 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vshxl" 
event={"ID":"3c3c4d3e-7053-4969-aea2-2995eb981024","Type":"ContainerDied","Data":"9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359"} Jan 29 14:28:31 crc kubenswrapper[4753]: I0129 14:28:31.866461 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vshxl" event={"ID":"3c3c4d3e-7053-4969-aea2-2995eb981024","Type":"ContainerStarted","Data":"89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd"} Jan 29 14:28:31 crc kubenswrapper[4753]: I0129 14:28:31.912974 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vshxl" podStartSLOduration=3.280825035 podStartE2EDuration="5.912945052s" podCreationTimestamp="2026-01-29 14:28:26 +0000 UTC" firstStartedPulling="2026-01-29 14:28:28.831583241 +0000 UTC m=+1543.526317663" lastFinishedPulling="2026-01-29 14:28:31.463703288 +0000 UTC m=+1546.158437680" observedRunningTime="2026-01-29 14:28:31.902399749 +0000 UTC m=+1546.597134171" watchObservedRunningTime="2026-01-29 14:28:31.912945052 +0000 UTC m=+1546.607679464" Jan 29 14:28:37 crc kubenswrapper[4753]: I0129 14:28:37.374263 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:37 crc kubenswrapper[4753]: I0129 14:28:37.375661 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:37 crc kubenswrapper[4753]: I0129 14:28:37.440085 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:37 crc kubenswrapper[4753]: I0129 14:28:37.964580 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:38 crc kubenswrapper[4753]: I0129 14:28:38.020493 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vshxl"] Jan 29 14:28:38 crc kubenswrapper[4753]: I0129 14:28:38.149774 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:28:38 crc kubenswrapper[4753]: E0129 14:28:38.150355 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:28:39 crc kubenswrapper[4753]: I0129 14:28:39.940134 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vshxl" podUID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerName="registry-server" containerID="cri-o://89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd" gracePeriod=2 Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.421051 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.458760 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm9jn\" (UniqueName: \"kubernetes.io/projected/3c3c4d3e-7053-4969-aea2-2995eb981024-kube-api-access-nm9jn\") pod \"3c3c4d3e-7053-4969-aea2-2995eb981024\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.458865 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-catalog-content\") pod \"3c3c4d3e-7053-4969-aea2-2995eb981024\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.458900 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-utilities\") pod \"3c3c4d3e-7053-4969-aea2-2995eb981024\" (UID: \"3c3c4d3e-7053-4969-aea2-2995eb981024\") " Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.460929 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-utilities" (OuterVolumeSpecName: "utilities") pod "3c3c4d3e-7053-4969-aea2-2995eb981024" (UID: "3c3c4d3e-7053-4969-aea2-2995eb981024"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.469393 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3c4d3e-7053-4969-aea2-2995eb981024-kube-api-access-nm9jn" (OuterVolumeSpecName: "kube-api-access-nm9jn") pod "3c3c4d3e-7053-4969-aea2-2995eb981024" (UID: "3c3c4d3e-7053-4969-aea2-2995eb981024"). InnerVolumeSpecName "kube-api-access-nm9jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.539949 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c3c4d3e-7053-4969-aea2-2995eb981024" (UID: "3c3c4d3e-7053-4969-aea2-2995eb981024"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.560591 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.560648 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4d3e-7053-4969-aea2-2995eb981024-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.560672 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm9jn\" (UniqueName: \"kubernetes.io/projected/3c3c4d3e-7053-4969-aea2-2995eb981024-kube-api-access-nm9jn\") on node \"crc\" DevicePath \"\"" Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.957902 4753 generic.go:334] "Generic (PLEG): container finished" podID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerID="89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd" exitCode=0 Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.957971 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vshxl" Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.957990 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vshxl" event={"ID":"3c3c4d3e-7053-4969-aea2-2995eb981024","Type":"ContainerDied","Data":"89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd"} Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.958759 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vshxl" event={"ID":"3c3c4d3e-7053-4969-aea2-2995eb981024","Type":"ContainerDied","Data":"b3b28543124a7392a39f6a37094ee336845e4cc5d9aa3b3ecf943e4ecf435514"} Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.958796 4753 scope.go:117] "RemoveContainer" containerID="89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd" Jan 29 14:28:40 crc kubenswrapper[4753]: I0129 14:28:40.989925 4753 scope.go:117] "RemoveContainer" containerID="9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359" Jan 29 14:28:41 crc kubenswrapper[4753]: I0129 14:28:41.009675 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vshxl"] Jan 29 14:28:41 crc kubenswrapper[4753]: I0129 14:28:41.014103 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vshxl"] Jan 29 14:28:41 crc kubenswrapper[4753]: I0129 14:28:41.041616 4753 scope.go:117] "RemoveContainer" containerID="8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c" Jan 29 14:28:41 crc kubenswrapper[4753]: I0129 14:28:41.069661 4753 scope.go:117] "RemoveContainer" containerID="89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd" Jan 29 14:28:41 crc kubenswrapper[4753]: E0129 14:28:41.070338 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd\": container with ID starting with 89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd not found: ID does not exist" containerID="89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd" Jan 29 14:28:41 crc kubenswrapper[4753]: I0129 14:28:41.070391 
4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd"} err="failed to get container status \"89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd\": rpc error: code = NotFound desc = could not find container \"89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd\": container with ID starting with 89028ecfc8546a7ddbead97a4f7d26b9bdd579ad7a928ee8ac7af0dbbb9f88bd not found: ID does not exist" Jan 29 14:28:41 crc kubenswrapper[4753]: I0129 14:28:41.070425 4753 scope.go:117] "RemoveContainer" containerID="9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359" Jan 29 14:28:41 crc kubenswrapper[4753]: E0129 14:28:41.070835 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359\": container with ID starting with 9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359 not found: ID does not exist" containerID="9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359" Jan 29 14:28:41 crc kubenswrapper[4753]: I0129 14:28:41.070885 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359"} err="failed to get container status \"9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359\": rpc error: code = NotFound desc = could not find container \"9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359\": container with ID starting with 9537481d99df7f76656f682223fcaea6135195cd1507d269da887e748a00b359 not found: ID does not exist" Jan 29 14:28:41 crc kubenswrapper[4753]: I0129 14:28:41.070947 4753 scope.go:117] "RemoveContainer" containerID="8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c" Jan 29 14:28:41 crc kubenswrapper[4753]: E0129 14:28:41.071557 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c\": container with ID starting with 8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c not found: ID does not exist" containerID="8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c" Jan 29 14:28:41 crc kubenswrapper[4753]: I0129 14:28:41.071601 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c"} err="failed to get container status \"8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c\": rpc error: code = NotFound desc = could not find container \"8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c\": container with ID starting with 8fe3836b2c191ec99a7d521fcbbacf290c603d5366aeeab7dcadfd573aa31c3c not found: ID does not exist" Jan 29 14:28:42 crc kubenswrapper[4753]: I0129 14:28:42.164106 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c3c4d3e-7053-4969-aea2-2995eb981024" path="/var/lib/kubelet/pods/3c3c4d3e-7053-4969-aea2-2995eb981024/volumes" Jan 29 14:28:53 crc kubenswrapper[4753]: I0129 14:28:53.150025 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:28:53 crc kubenswrapper[4753]: E0129 14:28:53.150958 4753 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:29:02 crc kubenswrapper[4753]: I0129 14:29:02.867374 4753 scope.go:117] "RemoveContainer" containerID="7e38bce9afd9e377481b189403130a86ffaab75f07eb1b987d4aa052f07ce6b2" Jan 29 14:29:02 crc kubenswrapper[4753]: I0129 14:29:02.939721 4753 scope.go:117] "RemoveContainer" containerID="308efde5a1b73816a7b74039efd8e127ce5aafd5f6d5ee3906227a9fcfccf2b8" Jan 29 14:29:02 crc kubenswrapper[4753]: I0129 14:29:02.970852 4753 scope.go:117] "RemoveContainer" containerID="5ebc3f1b2205648bb7601685da242e41c1b97a5b780c0a59881e50d9c9fc0cc9" Jan 29 14:29:03 crc kubenswrapper[4753]: I0129 14:29:03.031221 4753 scope.go:117] "RemoveContainer" containerID="ce2920a02cf6ae18e9004d13fee37b55154d0a3dda23fc0d8807c46cadda2a1c" Jan 29 14:29:03 crc kubenswrapper[4753]: I0129 14:29:03.093409 4753 scope.go:117] "RemoveContainer" containerID="1b5479e0d4430d6fc3745e3ed9afa5a4334d10df67d89187482d0455266b8f05" Jan 29 14:29:03 crc kubenswrapper[4753]: I0129 14:29:03.125034 4753 scope.go:117] "RemoveContainer" containerID="06f2a776446be8e207b874f0c09730d6dd63037c338c1c964cfe4ac94fa1b220" Jan 29 14:29:08 crc kubenswrapper[4753]: I0129 14:29:08.149618 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:29:08 crc kubenswrapper[4753]: E0129 14:29:08.150627 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:29:22 crc kubenswrapper[4753]: I0129 14:29:22.150619 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:29:22 crc kubenswrapper[4753]: E0129 14:29:22.152455 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:29:33 crc kubenswrapper[4753]: I0129 14:29:33.149868 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:29:33 crc kubenswrapper[4753]: E0129 14:29:33.151124 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:29:44 crc kubenswrapper[4753]: I0129 14:29:44.149334 4753 scope.go:117] 
"RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:29:44 crc kubenswrapper[4753]: E0129 14:29:44.150437 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:29:56 crc kubenswrapper[4753]: I0129 14:29:56.157482 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:29:56 crc kubenswrapper[4753]: E0129 14:29:56.159845 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.171516 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22"] Jan 29 14:30:00 crc kubenswrapper[4753]: E0129 14:30:00.172513 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerName="registry-server" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.172547 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerName="registry-server" Jan 29 14:30:00 crc kubenswrapper[4753]: E0129 14:30:00.172575 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerName="extract-content" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.172592 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerName="extract-content" Jan 29 14:30:00 crc kubenswrapper[4753]: E0129 14:30:00.172639 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerName="extract-utilities" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.172656 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerName="extract-utilities" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.172973 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3c4d3e-7053-4969-aea2-2995eb981024" containerName="registry-server" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.174045 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.177336 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.177815 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.198981 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22"] Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.252794 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-config-volume\") pod \"collect-profiles-29494950-62n22\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.252890 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9ncn\" (UniqueName: \"kubernetes.io/projected/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-kube-api-access-z9ncn\") pod \"collect-profiles-29494950-62n22\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.253057 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-secret-volume\") pod \"collect-profiles-29494950-62n22\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.354567 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-config-volume\") pod \"collect-profiles-29494950-62n22\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.354635 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9ncn\" (UniqueName: \"kubernetes.io/projected/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-kube-api-access-z9ncn\") pod \"collect-profiles-29494950-62n22\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.354685 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-secret-volume\") pod \"collect-profiles-29494950-62n22\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.357654 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-config-volume\") pod 
\"collect-profiles-29494950-62n22\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.365559 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-secret-volume\") pod \"collect-profiles-29494950-62n22\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.381144 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9ncn\" (UniqueName: \"kubernetes.io/projected/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-kube-api-access-z9ncn\") pod \"collect-profiles-29494950-62n22\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.507624 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:00 crc kubenswrapper[4753]: I0129 14:30:00.761767 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22"] Jan 29 14:30:01 crc kubenswrapper[4753]: I0129 14:30:01.737675 4753 generic.go:334] "Generic (PLEG): container finished" podID="41f636be-69c3-4afd-a2f0-fbf0e47c0e83" containerID="c173db9ae7bc291d121b51324e26a1f9182e273f61a8730452f19be66d914eb4" exitCode=0 Jan 29 14:30:01 crc kubenswrapper[4753]: I0129 14:30:01.737752 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" event={"ID":"41f636be-69c3-4afd-a2f0-fbf0e47c0e83","Type":"ContainerDied","Data":"c173db9ae7bc291d121b51324e26a1f9182e273f61a8730452f19be66d914eb4"} Jan 29 14:30:01 crc kubenswrapper[4753]: I0129 14:30:01.738089 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" event={"ID":"41f636be-69c3-4afd-a2f0-fbf0e47c0e83","Type":"ContainerStarted","Data":"4a1f8d5fbbe01525d993c970f344a18ead12572168e04258119d98938663826f"} Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.076045 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.097940 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-secret-volume\") pod \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.098079 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9ncn\" (UniqueName: \"kubernetes.io/projected/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-kube-api-access-z9ncn\") pod \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.098144 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-config-volume\") pod \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\" (UID: \"41f636be-69c3-4afd-a2f0-fbf0e47c0e83\") " Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.098930 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-config-volume" (OuterVolumeSpecName: "config-volume") pod "41f636be-69c3-4afd-a2f0-fbf0e47c0e83" (UID: "41f636be-69c3-4afd-a2f0-fbf0e47c0e83"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.099894 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.103431 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-kube-api-access-z9ncn" (OuterVolumeSpecName: "kube-api-access-z9ncn") pod "41f636be-69c3-4afd-a2f0-fbf0e47c0e83" (UID: "41f636be-69c3-4afd-a2f0-fbf0e47c0e83"). InnerVolumeSpecName "kube-api-access-z9ncn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.103584 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "41f636be-69c3-4afd-a2f0-fbf0e47c0e83" (UID: "41f636be-69c3-4afd-a2f0-fbf0e47c0e83"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.201210 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9ncn\" (UniqueName: \"kubernetes.io/projected/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-kube-api-access-z9ncn\") on node \"crc\" DevicePath \"\"" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.201633 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41f636be-69c3-4afd-a2f0-fbf0e47c0e83-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.266094 4753 scope.go:117] "RemoveContainer" containerID="1382f4decacf5f428514245aaf0e2515208a47c41adb2f648c7202bccd9216f9" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.314714 4753 scope.go:117] "RemoveContainer" containerID="857539f16ab7d5f43263fb3fc69c40c7cfc6306d139f5a0e01c992147a025e17" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.756801 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" event={"ID":"41f636be-69c3-4afd-a2f0-fbf0e47c0e83","Type":"ContainerDied","Data":"4a1f8d5fbbe01525d993c970f344a18ead12572168e04258119d98938663826f"} Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.756864 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a1f8d5fbbe01525d993c970f344a18ead12572168e04258119d98938663826f" Jan 29 14:30:03 crc kubenswrapper[4753]: I0129 14:30:03.756914 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22" Jan 29 14:30:09 crc kubenswrapper[4753]: I0129 14:30:09.149568 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:30:09 crc kubenswrapper[4753]: E0129 14:30:09.152013 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:30:21 crc kubenswrapper[4753]: I0129 14:30:21.149369 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:30:21 crc kubenswrapper[4753]: E0129 14:30:21.150076 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:30:36 crc kubenswrapper[4753]: I0129 14:30:36.153774 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:30:36 crc kubenswrapper[4753]: E0129 14:30:36.154606 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Jan 29 14:30:48 crc kubenswrapper[4753]: I0129 14:30:48.149414 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:30:48 crc kubenswrapper[4753]: E0129 14:30:48.150770 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:31:01 crc kubenswrapper[4753]: I0129 14:31:01.149404 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:31:01 crc kubenswrapper[4753]: E0129 14:31:01.150362 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:31:13 crc kubenswrapper[4753]: I0129 14:31:13.149536 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:31:13 crc kubenswrapper[4753]: E0129 14:31:13.150885 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:31:25 crc kubenswrapper[4753]: I0129 14:31:25.150376 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:31:25 crc kubenswrapper[4753]: E0129 14:31:25.152722 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:31:37 crc kubenswrapper[4753]: I0129 14:31:37.149506 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:31:37 crc kubenswrapper[4753]: E0129 14:31:37.150563 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:31:48 crc kubenswrapper[4753]: I0129 14:31:48.154829 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:31:48 crc kubenswrapper[4753]: E0129 14:31:48.156484 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:32:01 crc kubenswrapper[4753]: I0129 14:32:01.149037 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:32:01 crc kubenswrapper[4753]: E0129 14:32:01.150000 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:32:03 crc kubenswrapper[4753]: I0129 14:32:03.411431 4753 scope.go:117] "RemoveContainer" containerID="67739f88084ac5b79aa18eda2babb5c98fa1c4315a0484afb6058571d3eeec15"
Jan 29 14:32:03 crc kubenswrapper[4753]: I0129 14:32:03.435991 4753 scope.go:117] "RemoveContainer" containerID="2b8d6f5f823d79c5552be1a33a063d9917a87efebc6b117bc7cf6449fe76d059"
Jan 29 14:32:03 crc kubenswrapper[4753]: I0129 14:32:03.469330 4753 scope.go:117] "RemoveContainer" containerID="ee659220c5a97c6757aaa93bc87dc40cc3e931a6647a61d65e9b6af24ceca0f8"
Jan 29 14:32:12 crc kubenswrapper[4753]: I0129 14:32:12.149782 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:32:12 crc kubenswrapper[4753]: E0129 14:32:12.150670 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:32:26 crc kubenswrapper[4753]: I0129 14:32:26.157303 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:32:26 crc kubenswrapper[4753]: E0129 14:32:26.158209 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:32:40 crc kubenswrapper[4753]: I0129 14:32:40.150791 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:32:40 crc kubenswrapper[4753]: E0129 14:32:40.154025 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:32:54 crc kubenswrapper[4753]: I0129 14:32:54.148871 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:32:54 crc kubenswrapper[4753]: E0129 14:32:54.149740 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:33:06 crc kubenswrapper[4753]: I0129 14:33:06.156238 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:33:06 crc kubenswrapper[4753]: E0129 14:33:06.158794 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:33:19 crc kubenswrapper[4753]: I0129 14:33:19.150317 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:33:19 crc kubenswrapper[4753]: E0129 14:33:19.151538 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 14:33:32 crc kubenswrapper[4753]: I0129 14:33:32.149514 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88"
Jan 29 14:33:32 crc kubenswrapper[4753]: I0129 14:33:32.712804 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"0f86dfb6acef39d11ef3c29fe257ba700155bfdc434286a050a57a0cbe7089c3"}
Jan 29 14:35:20 crc kubenswrapper[4753]: I0129 14:35:20.874427 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l5ncs"]
Jan 29 14:35:20 crc kubenswrapper[4753]: E0129 14:35:20.875522 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f636be-69c3-4afd-a2f0-fbf0e47c0e83" containerName="collect-profiles"
Jan 29 14:35:20 crc kubenswrapper[4753]: I0129 14:35:20.875544 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f636be-69c3-4afd-a2f0-fbf0e47c0e83" containerName="collect-profiles"
Jan 29 14:35:20 crc kubenswrapper[4753]: I0129 14:35:20.875815 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f636be-69c3-4afd-a2f0-fbf0e47c0e83" containerName="collect-profiles"
Jan 29 14:35:20 crc kubenswrapper[4753]: I0129 14:35:20.877658 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:20 crc kubenswrapper[4753]: I0129 14:35:20.887609 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5ncs"]
Jan 29 14:35:20 crc kubenswrapper[4753]: I0129 14:35:20.919548 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-utilities\") pod \"redhat-operators-l5ncs\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") " pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:20 crc kubenswrapper[4753]: I0129 14:35:20.919612 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-catalog-content\") pod \"redhat-operators-l5ncs\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") " pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:20 crc kubenswrapper[4753]: I0129 14:35:20.919758 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bp68\" (UniqueName: \"kubernetes.io/projected/da789a68-3efa-4053-802e-13a6b8914c44-kube-api-access-5bp68\") pod \"redhat-operators-l5ncs\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") " pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.020748 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-utilities\") pod \"redhat-operators-l5ncs\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") " pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.020791 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-catalog-content\") pod \"redhat-operators-l5ncs\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") " pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.020860 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bp68\" (UniqueName: \"kubernetes.io/projected/da789a68-3efa-4053-802e-13a6b8914c44-kube-api-access-5bp68\") pod \"redhat-operators-l5ncs\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") " pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.021447 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-utilities\") pod \"redhat-operators-l5ncs\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") " pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.021519 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-catalog-content\") pod \"redhat-operators-l5ncs\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") " pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.047267 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bp68\" (UniqueName: \"kubernetes.io/projected/da789a68-3efa-4053-802e-13a6b8914c44-kube-api-access-5bp68\") pod \"redhat-operators-l5ncs\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") " pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.206462 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.433964 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5ncs"]
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.706600 4753 generic.go:334] "Generic (PLEG): container finished" podID="da789a68-3efa-4053-802e-13a6b8914c44" containerID="cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca" exitCode=0
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.706867 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5ncs" event={"ID":"da789a68-3efa-4053-802e-13a6b8914c44","Type":"ContainerDied","Data":"cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca"}
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.707018 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5ncs" event={"ID":"da789a68-3efa-4053-802e-13a6b8914c44","Type":"ContainerStarted","Data":"9b760c091500b6e7655bfc1844ef44b8203965334760db9a63cf20bf995ad4c8"}
Jan 29 14:35:21 crc kubenswrapper[4753]: I0129 14:35:21.709027 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 29 14:35:23 crc kubenswrapper[4753]: I0129 14:35:23.721805 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5ncs" event={"ID":"da789a68-3efa-4053-802e-13a6b8914c44","Type":"ContainerStarted","Data":"80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07"}
Jan 29 14:35:24 crc kubenswrapper[4753]: I0129 14:35:24.733360 4753 generic.go:334] "Generic (PLEG): container finished" podID="da789a68-3efa-4053-802e-13a6b8914c44" containerID="80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07" exitCode=0
Jan 29 14:35:24 crc kubenswrapper[4753]: I0129 14:35:24.733687 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5ncs" event={"ID":"da789a68-3efa-4053-802e-13a6b8914c44","Type":"ContainerDied","Data":"80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07"}
Jan 29 14:35:25 crc kubenswrapper[4753]: I0129 14:35:25.744918 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5ncs" event={"ID":"da789a68-3efa-4053-802e-13a6b8914c44","Type":"ContainerStarted","Data":"9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008"}
Jan 29 14:35:25 crc kubenswrapper[4753]: I0129 14:35:25.784314 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l5ncs" podStartSLOduration=2.307260485 podStartE2EDuration="5.78428726s" podCreationTimestamp="2026-01-29 14:35:20 +0000 UTC" firstStartedPulling="2026-01-29 14:35:21.708684371 +0000 UTC m=+1956.403418753" lastFinishedPulling="2026-01-29 14:35:25.185711116 +0000 UTC m=+1959.880445528" observedRunningTime="2026-01-29 14:35:25.778016806 +0000 UTC m=+1960.472751248" watchObservedRunningTime="2026-01-29 14:35:25.78428726 +0000 UTC m=+1960.479021692"
Jan 29 14:35:31 crc kubenswrapper[4753]: I0129 14:35:31.207538 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:31 crc kubenswrapper[4753]: I0129 14:35:31.208047 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:32 crc kubenswrapper[4753]: I0129 14:35:32.256751 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l5ncs" podUID="da789a68-3efa-4053-802e-13a6b8914c44" containerName="registry-server" probeResult="failure" output=<
Jan 29 14:35:32 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s
Jan 29 14:35:32 crc kubenswrapper[4753]: >
Jan 29 14:35:41 crc kubenswrapper[4753]: I0129 14:35:41.293729 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:41 crc kubenswrapper[4753]: I0129 14:35:41.364478 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:41 crc kubenswrapper[4753]: I0129 14:35:41.551955 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5ncs"]
Jan 29 14:35:42 crc kubenswrapper[4753]: I0129 14:35:42.876848 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l5ncs" podUID="da789a68-3efa-4053-802e-13a6b8914c44" containerName="registry-server" containerID="cri-o://9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008" gracePeriod=2
Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.495702 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5ncs"
Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.614418 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-catalog-content\") pod \"da789a68-3efa-4053-802e-13a6b8914c44\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") "
Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.614612 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bp68\" (UniqueName: \"kubernetes.io/projected/da789a68-3efa-4053-802e-13a6b8914c44-kube-api-access-5bp68\") pod \"da789a68-3efa-4053-802e-13a6b8914c44\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") "
Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.614674 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-utilities\") pod \"da789a68-3efa-4053-802e-13a6b8914c44\" (UID: \"da789a68-3efa-4053-802e-13a6b8914c44\") "
Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.616009 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-utilities" (OuterVolumeSpecName: "utilities") pod "da789a68-3efa-4053-802e-13a6b8914c44" (UID: "da789a68-3efa-4053-802e-13a6b8914c44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.622122 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da789a68-3efa-4053-802e-13a6b8914c44-kube-api-access-5bp68" (OuterVolumeSpecName: "kube-api-access-5bp68") pod "da789a68-3efa-4053-802e-13a6b8914c44" (UID: "da789a68-3efa-4053-802e-13a6b8914c44"). InnerVolumeSpecName "kube-api-access-5bp68". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.716678 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bp68\" (UniqueName: \"kubernetes.io/projected/da789a68-3efa-4053-802e-13a6b8914c44-kube-api-access-5bp68\") on node \"crc\" DevicePath \"\"" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.717038 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.783679 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da789a68-3efa-4053-802e-13a6b8914c44" (UID: "da789a68-3efa-4053-802e-13a6b8914c44"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.817718 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da789a68-3efa-4053-802e-13a6b8914c44-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.886936 4753 generic.go:334] "Generic (PLEG): container finished" podID="da789a68-3efa-4053-802e-13a6b8914c44" containerID="9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008" exitCode=0 Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.886984 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5ncs" event={"ID":"da789a68-3efa-4053-802e-13a6b8914c44","Type":"ContainerDied","Data":"9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008"} Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.887016 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5ncs" event={"ID":"da789a68-3efa-4053-802e-13a6b8914c44","Type":"ContainerDied","Data":"9b760c091500b6e7655bfc1844ef44b8203965334760db9a63cf20bf995ad4c8"} Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.887042 4753 scope.go:117] "RemoveContainer" containerID="9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.887043 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l5ncs" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.912238 4753 scope.go:117] "RemoveContainer" containerID="80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.928634 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5ncs"] Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.947073 4753 scope.go:117] "RemoveContainer" containerID="cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.951843 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l5ncs"] Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.988846 4753 scope.go:117] "RemoveContainer" containerID="9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008" Jan 29 14:35:43 crc kubenswrapper[4753]: E0129 14:35:43.989426 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008\": container with ID starting with 9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008 not found: ID does not exist" containerID="9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.989496 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008"} err="failed to get container status \"9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008\": rpc error: code = NotFound desc = could not find container \"9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008\": container with ID starting with 9d79886b70db4adf0c182523cc13a378fdf71fabced4f3fa0fc465548bc46008 not found: ID does not exist" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.989543 4753 scope.go:117] "RemoveContainer" containerID="80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07" Jan 29 14:35:43 crc kubenswrapper[4753]: E0129 14:35:43.990088 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07\": container with ID starting with 80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07 not found: ID does not exist" containerID="80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.990220 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07"} err="failed to get container status \"80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07\": rpc error: code = NotFound desc = could not find container \"80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07\": container with ID starting with 80a9a85c8e89d1dfd4b3170f12e1921efdc45000cb9d1ee327b711c9ab026b07 not found: ID does not exist" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.990313 4753 scope.go:117] "RemoveContainer" containerID="cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca" Jan 29 14:35:43 crc kubenswrapper[4753]: E0129 14:35:43.990724 4753 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca\": container with ID starting with cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca not found: ID does not exist" containerID="cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca" Jan 29 14:35:43 crc kubenswrapper[4753]: I0129 14:35:43.990809 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca"} err="failed to get container status \"cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca\": rpc error: code = NotFound desc = could not find container \"cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca\": container with ID starting with cba8e80d297bb3b04eb1ede5f203b0cf7761f8a6b726f6900860e5fa3aaa79ca not found: ID does not exist" Jan 29 14:35:44 crc kubenswrapper[4753]: I0129 14:35:44.163230 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da789a68-3efa-4053-802e-13a6b8914c44" path="/var/lib/kubelet/pods/da789a68-3efa-4053-802e-13a6b8914c44/volumes" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.111606 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zmdqf"] Jan 29 14:35:53 crc kubenswrapper[4753]: E0129 14:35:53.115504 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da789a68-3efa-4053-802e-13a6b8914c44" containerName="extract-utilities" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.115530 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="da789a68-3efa-4053-802e-13a6b8914c44" containerName="extract-utilities" Jan 29 14:35:53 crc kubenswrapper[4753]: E0129 14:35:53.115544 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da789a68-3efa-4053-802e-13a6b8914c44" containerName="registry-server" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.115557 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="da789a68-3efa-4053-802e-13a6b8914c44" containerName="registry-server" Jan 29 14:35:53 crc kubenswrapper[4753]: E0129 14:35:53.115577 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da789a68-3efa-4053-802e-13a6b8914c44" containerName="extract-content" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.115590 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="da789a68-3efa-4053-802e-13a6b8914c44" containerName="extract-content" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.115856 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="da789a68-3efa-4053-802e-13a6b8914c44" containerName="registry-server" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.117522 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.135344 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zmdqf"] Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.270008 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-catalog-content\") pod \"certified-operators-zmdqf\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.270311 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tcmv\" (UniqueName: \"kubernetes.io/projected/469997cd-a0ab-4219-ab81-442db29f0621-kube-api-access-2tcmv\") pod \"certified-operators-zmdqf\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.270377 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-utilities\") pod \"certified-operators-zmdqf\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.371536 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-catalog-content\") pod \"certified-operators-zmdqf\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.371614 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tcmv\" (UniqueName: \"kubernetes.io/projected/469997cd-a0ab-4219-ab81-442db29f0621-kube-api-access-2tcmv\") pod \"certified-operators-zmdqf\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.371678 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-utilities\") pod \"certified-operators-zmdqf\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.372381 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-catalog-content\") pod \"certified-operators-zmdqf\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.372415 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-utilities\") pod \"certified-operators-zmdqf\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.393884 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2tcmv\" (UniqueName: \"kubernetes.io/projected/469997cd-a0ab-4219-ab81-442db29f0621-kube-api-access-2tcmv\") pod \"certified-operators-zmdqf\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.440106 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.707837 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zmdqf"] Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.972456 4753 generic.go:334] "Generic (PLEG): container finished" podID="469997cd-a0ab-4219-ab81-442db29f0621" containerID="0c524c4221737d707174fc79639fd24008e12c27a8be14a10e8ec02b5afa626b" exitCode=0 Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.972533 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmdqf" event={"ID":"469997cd-a0ab-4219-ab81-442db29f0621","Type":"ContainerDied","Data":"0c524c4221737d707174fc79639fd24008e12c27a8be14a10e8ec02b5afa626b"} Jan 29 14:35:53 crc kubenswrapper[4753]: I0129 14:35:53.972611 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmdqf" event={"ID":"469997cd-a0ab-4219-ab81-442db29f0621","Type":"ContainerStarted","Data":"e0810b2248aabff31d6749e9ec49b47837d7881a33b85df7cb74a93a4697ef1b"} Jan 29 14:35:54 crc kubenswrapper[4753]: I0129 14:35:54.987975 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmdqf" event={"ID":"469997cd-a0ab-4219-ab81-442db29f0621","Type":"ContainerStarted","Data":"07640e8667ce6794581242c6ab96519eaa6791d3134276f1de8b132c6dd85fe5"} Jan 29 14:35:55 crc kubenswrapper[4753]: I0129 14:35:55.999352 4753 generic.go:334] "Generic (PLEG): container finished" podID="469997cd-a0ab-4219-ab81-442db29f0621" containerID="07640e8667ce6794581242c6ab96519eaa6791d3134276f1de8b132c6dd85fe5" exitCode=0 Jan 29 14:35:55 crc kubenswrapper[4753]: I0129 14:35:55.999391 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmdqf" event={"ID":"469997cd-a0ab-4219-ab81-442db29f0621","Type":"ContainerDied","Data":"07640e8667ce6794581242c6ab96519eaa6791d3134276f1de8b132c6dd85fe5"} Jan 29 14:35:57 crc kubenswrapper[4753]: I0129 14:35:57.007632 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmdqf" event={"ID":"469997cd-a0ab-4219-ab81-442db29f0621","Type":"ContainerStarted","Data":"a51bd8d0cda9538352889dccb9c1bade27c7d03cbe574ccbdb67b030dac0bc36"} Jan 29 14:35:57 crc kubenswrapper[4753]: I0129 14:35:57.033825 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zmdqf" podStartSLOduration=1.579060101 podStartE2EDuration="4.033807349s" podCreationTimestamp="2026-01-29 14:35:53 +0000 UTC" firstStartedPulling="2026-01-29 14:35:53.97377567 +0000 UTC m=+1988.668510052" lastFinishedPulling="2026-01-29 14:35:56.428522888 +0000 UTC m=+1991.123257300" observedRunningTime="2026-01-29 14:35:57.031601524 +0000 UTC m=+1991.726335976" watchObservedRunningTime="2026-01-29 14:35:57.033807349 +0000 UTC m=+1991.728541731" Jan 29 14:35:57 crc kubenswrapper[4753]: I0129 14:35:57.055613 4753 patch_prober.go:28] interesting 
pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:35:57 crc kubenswrapper[4753]: I0129 14:35:57.055707 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:36:03 crc kubenswrapper[4753]: I0129 14:36:03.442427 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:36:03 crc kubenswrapper[4753]: I0129 14:36:03.443265 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:36:03 crc kubenswrapper[4753]: I0129 14:36:03.517433 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:36:04 crc kubenswrapper[4753]: I0129 14:36:04.165849 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:36:06 crc kubenswrapper[4753]: I0129 14:36:06.276881 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zmdqf"] Jan 29 14:36:06 crc kubenswrapper[4753]: I0129 14:36:06.277524 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zmdqf" podUID="469997cd-a0ab-4219-ab81-442db29f0621" containerName="registry-server" containerID="cri-o://a51bd8d0cda9538352889dccb9c1bade27c7d03cbe574ccbdb67b030dac0bc36" gracePeriod=2 Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.105014 4753 generic.go:334] "Generic (PLEG): container finished" podID="469997cd-a0ab-4219-ab81-442db29f0621" containerID="a51bd8d0cda9538352889dccb9c1bade27c7d03cbe574ccbdb67b030dac0bc36" exitCode=0 Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.105113 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmdqf" event={"ID":"469997cd-a0ab-4219-ab81-442db29f0621","Type":"ContainerDied","Data":"a51bd8d0cda9538352889dccb9c1bade27c7d03cbe574ccbdb67b030dac0bc36"} Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.239479 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.400113 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-utilities\") pod \"469997cd-a0ab-4219-ab81-442db29f0621\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.400190 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-catalog-content\") pod \"469997cd-a0ab-4219-ab81-442db29f0621\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.400225 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tcmv\" (UniqueName: \"kubernetes.io/projected/469997cd-a0ab-4219-ab81-442db29f0621-kube-api-access-2tcmv\") pod \"469997cd-a0ab-4219-ab81-442db29f0621\" (UID: \"469997cd-a0ab-4219-ab81-442db29f0621\") " Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.401468 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-utilities" (OuterVolumeSpecName: "utilities") pod "469997cd-a0ab-4219-ab81-442db29f0621" (UID: "469997cd-a0ab-4219-ab81-442db29f0621"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.411375 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469997cd-a0ab-4219-ab81-442db29f0621-kube-api-access-2tcmv" (OuterVolumeSpecName: "kube-api-access-2tcmv") pod "469997cd-a0ab-4219-ab81-442db29f0621" (UID: "469997cd-a0ab-4219-ab81-442db29f0621"). InnerVolumeSpecName "kube-api-access-2tcmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.479712 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "469997cd-a0ab-4219-ab81-442db29f0621" (UID: "469997cd-a0ab-4219-ab81-442db29f0621"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.501939 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.502001 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/469997cd-a0ab-4219-ab81-442db29f0621-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:36:07 crc kubenswrapper[4753]: I0129 14:36:07.502028 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tcmv\" (UniqueName: \"kubernetes.io/projected/469997cd-a0ab-4219-ab81-442db29f0621-kube-api-access-2tcmv\") on node \"crc\" DevicePath \"\"" Jan 29 14:36:08 crc kubenswrapper[4753]: I0129 14:36:08.114945 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmdqf" event={"ID":"469997cd-a0ab-4219-ab81-442db29f0621","Type":"ContainerDied","Data":"e0810b2248aabff31d6749e9ec49b47837d7881a33b85df7cb74a93a4697ef1b"} Jan 29 14:36:08 crc kubenswrapper[4753]: I0129 14:36:08.114996 4753 scope.go:117] "RemoveContainer" containerID="a51bd8d0cda9538352889dccb9c1bade27c7d03cbe574ccbdb67b030dac0bc36" Jan 29 14:36:08 crc kubenswrapper[4753]: I0129 14:36:08.115111 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmdqf" Jan 29 14:36:08 crc kubenswrapper[4753]: I0129 14:36:08.158748 4753 scope.go:117] "RemoveContainer" containerID="07640e8667ce6794581242c6ab96519eaa6791d3134276f1de8b132c6dd85fe5" Jan 29 14:36:08 crc kubenswrapper[4753]: I0129 14:36:08.172338 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zmdqf"] Jan 29 14:36:08 crc kubenswrapper[4753]: I0129 14:36:08.172396 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zmdqf"] Jan 29 14:36:08 crc kubenswrapper[4753]: I0129 14:36:08.207660 4753 scope.go:117] "RemoveContainer" containerID="0c524c4221737d707174fc79639fd24008e12c27a8be14a10e8ec02b5afa626b" Jan 29 14:36:10 crc kubenswrapper[4753]: I0129 14:36:10.158130 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="469997cd-a0ab-4219-ab81-442db29f0621" path="/var/lib/kubelet/pods/469997cd-a0ab-4219-ab81-442db29f0621/volumes" Jan 29 14:36:27 crc kubenswrapper[4753]: I0129 14:36:27.054632 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:36:27 crc kubenswrapper[4753]: I0129 14:36:27.055403 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:36:57 crc kubenswrapper[4753]: I0129 14:36:57.055090 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:36:57 crc kubenswrapper[4753]: I0129 14:36:57.055761 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:36:57 crc kubenswrapper[4753]: I0129 14:36:57.055818 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:36:57 crc kubenswrapper[4753]: I0129 14:36:57.056732 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0f86dfb6acef39d11ef3c29fe257ba700155bfdc434286a050a57a0cbe7089c3"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:36:57 crc kubenswrapper[4753]: I0129 14:36:57.056809 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://0f86dfb6acef39d11ef3c29fe257ba700155bfdc434286a050a57a0cbe7089c3" gracePeriod=600 Jan 29 14:36:57 crc kubenswrapper[4753]: I0129 14:36:57.491860 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="0f86dfb6acef39d11ef3c29fe257ba700155bfdc434286a050a57a0cbe7089c3" exitCode=0 Jan 29 14:36:57 crc kubenswrapper[4753]: I0129 14:36:57.492032 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"0f86dfb6acef39d11ef3c29fe257ba700155bfdc434286a050a57a0cbe7089c3"} Jan 29 14:36:57 crc kubenswrapper[4753]: I0129 14:36:57.492286 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47"} Jan 29 14:36:57 crc kubenswrapper[4753]: I0129 14:36:57.492311 4753 scope.go:117] "RemoveContainer" containerID="6d177ec05e72c5959770c1f87e4ec4613dbdf5ffc53d0c3ce8385b5796a05f88" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.270208 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wz2xz"] Jan 29 14:37:44 crc kubenswrapper[4753]: E0129 14:37:44.271043 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469997cd-a0ab-4219-ab81-442db29f0621" containerName="registry-server" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.271056 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="469997cd-a0ab-4219-ab81-442db29f0621" containerName="registry-server" Jan 29 14:37:44 crc kubenswrapper[4753]: E0129 14:37:44.271078 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469997cd-a0ab-4219-ab81-442db29f0621" containerName="extract-content" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.271084 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="469997cd-a0ab-4219-ab81-442db29f0621" 
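
Editor's note: the repeating probe entries for machine-config-daemon-x6rpz show the kubelet's HTTP liveness mechanism. The kubelet issues a GET against the container's endpoint (127.0.0.1:8798/health here) and counts any transport error, such as this connection refusal, as a failed probe; the roughly 30-second spacing of the entries matches the probe period, and once enough consecutive failures accumulate the container is killed with its grace period (600s above) and restarted. A minimal prober sketch in Go follows; the threshold of 3 is an assumption consistent with the kill landing on the third logged failure, not something the log states:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce performs one HTTP liveness check. A transport error (e.g.
// "connect: connection refused") or a non-2xx status counts as failure,
// matching the probeResult="failure" entries in the log.
func probeOnce(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // connection refused, timeout, DNS failure, ...
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	const failureThreshold = 3 // assumed; consistent with the log, and the kubelet default
	failures := 0
	for {
		if err := probeOnce("http://127.0.0.1:8798/health", 1*time.Second); err != nil {
			failures++
			fmt.Printf("Probe failed (%d/%d): %v\n", failures, failureThreshold, err)
			if failures >= failureThreshold {
				fmt.Println("liveness threshold crossed: container would be killed and restarted")
				return
			}
		} else {
			failures = 0 // a healthy probe resets the counter
		}
		time.Sleep(30 * time.Second) // matches the ~30s spacing of the probe entries
	}
}
```
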
containerName="extract-content" Jan 29 14:37:44 crc kubenswrapper[4753]: E0129 14:37:44.271104 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469997cd-a0ab-4219-ab81-442db29f0621" containerName="extract-utilities" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.271110 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="469997cd-a0ab-4219-ab81-442db29f0621" containerName="extract-utilities" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.271268 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="469997cd-a0ab-4219-ab81-442db29f0621" containerName="registry-server" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.272259 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.295118 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz2xz"] Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.437828 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-utilities\") pod \"redhat-marketplace-wz2xz\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.437928 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49xwn\" (UniqueName: \"kubernetes.io/projected/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-kube-api-access-49xwn\") pod \"redhat-marketplace-wz2xz\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.437983 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-catalog-content\") pod \"redhat-marketplace-wz2xz\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.539882 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-utilities\") pod \"redhat-marketplace-wz2xz\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.539962 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49xwn\" (UniqueName: \"kubernetes.io/projected/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-kube-api-access-49xwn\") pod \"redhat-marketplace-wz2xz\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.540002 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-catalog-content\") pod \"redhat-marketplace-wz2xz\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.540486 4753 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-utilities\") pod \"redhat-marketplace-wz2xz\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.540630 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-catalog-content\") pod \"redhat-marketplace-wz2xz\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.559315 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49xwn\" (UniqueName: \"kubernetes.io/projected/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-kube-api-access-49xwn\") pod \"redhat-marketplace-wz2xz\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.586878 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.843486 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz2xz"] Jan 29 14:37:44 crc kubenswrapper[4753]: I0129 14:37:44.969975 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz2xz" event={"ID":"a4a0a80d-56bf-4e3a-adbf-522c927a0c95","Type":"ContainerStarted","Data":"9c6a8dce1a66006de9aebdc0256792943b2d9abadd8579e3eeb314f43d26137e"} Jan 29 14:37:45 crc kubenswrapper[4753]: I0129 14:37:45.978710 4753 generic.go:334] "Generic (PLEG): container finished" podID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerID="bfc18149847a486d2da5b62d9f824fabb196401cef519f993cb3760724b46e59" exitCode=0 Jan 29 14:37:45 crc kubenswrapper[4753]: I0129 14:37:45.978760 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz2xz" event={"ID":"a4a0a80d-56bf-4e3a-adbf-522c927a0c95","Type":"ContainerDied","Data":"bfc18149847a486d2da5b62d9f824fabb196401cef519f993cb3760724b46e59"} Jan 29 14:37:47 crc kubenswrapper[4753]: I0129 14:37:47.995298 4753 generic.go:334] "Generic (PLEG): container finished" podID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerID="74aa4868552fba6ad5ebba57875664929981a8ff53b1ba7130d5e79768c3bb72" exitCode=0 Jan 29 14:37:47 crc kubenswrapper[4753]: I0129 14:37:47.995456 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz2xz" event={"ID":"a4a0a80d-56bf-4e3a-adbf-522c927a0c95","Type":"ContainerDied","Data":"74aa4868552fba6ad5ebba57875664929981a8ff53b1ba7130d5e79768c3bb72"} Jan 29 14:37:50 crc kubenswrapper[4753]: I0129 14:37:50.016342 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz2xz" event={"ID":"a4a0a80d-56bf-4e3a-adbf-522c927a0c95","Type":"ContainerStarted","Data":"eeb5c9592aee6f4124adf4b882f5daf86fa66d80801492fb0302747ff4ca4236"} Jan 29 14:37:54 crc kubenswrapper[4753]: I0129 14:37:54.587293 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:54 crc kubenswrapper[4753]: I0129 14:37:54.589069 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:54 crc kubenswrapper[4753]: I0129 14:37:54.634956 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:54 crc kubenswrapper[4753]: I0129 14:37:54.660257 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wz2xz" podStartSLOduration=6.841801133 podStartE2EDuration="10.660237185s" podCreationTimestamp="2026-01-29 14:37:44 +0000 UTC" firstStartedPulling="2026-01-29 14:37:45.980214424 +0000 UTC m=+2100.674948806" lastFinishedPulling="2026-01-29 14:37:49.798650446 +0000 UTC m=+2104.493384858" observedRunningTime="2026-01-29 14:37:50.037985351 +0000 UTC m=+2104.732719753" watchObservedRunningTime="2026-01-29 14:37:54.660237185 +0000 UTC m=+2109.354971577" Jan 29 14:37:55 crc kubenswrapper[4753]: I0129 14:37:55.108471 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:37:56 crc kubenswrapper[4753]: I0129 14:37:56.258392 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz2xz"] Jan 29 14:37:58 crc kubenswrapper[4753]: I0129 14:37:58.085190 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wz2xz" podUID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerName="registry-server" containerID="cri-o://eeb5c9592aee6f4124adf4b882f5daf86fa66d80801492fb0302747ff4ca4236" gracePeriod=2 Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.102199 4753 generic.go:334] "Generic (PLEG): container finished" podID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerID="eeb5c9592aee6f4124adf4b882f5daf86fa66d80801492fb0302747ff4ca4236" exitCode=0 Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.102279 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz2xz" event={"ID":"a4a0a80d-56bf-4e3a-adbf-522c927a0c95","Type":"ContainerDied","Data":"eeb5c9592aee6f4124adf4b882f5daf86fa66d80801492fb0302747ff4ca4236"} Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.468798 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.585236 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-catalog-content\") pod \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.585294 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-utilities\") pod \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.585357 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49xwn\" (UniqueName: \"kubernetes.io/projected/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-kube-api-access-49xwn\") pod \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\" (UID: \"a4a0a80d-56bf-4e3a-adbf-522c927a0c95\") " Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.586079 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-utilities" (OuterVolumeSpecName: "utilities") pod "a4a0a80d-56bf-4e3a-adbf-522c927a0c95" (UID: "a4a0a80d-56bf-4e3a-adbf-522c927a0c95"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.590834 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-kube-api-access-49xwn" (OuterVolumeSpecName: "kube-api-access-49xwn") pod "a4a0a80d-56bf-4e3a-adbf-522c927a0c95" (UID: "a4a0a80d-56bf-4e3a-adbf-522c927a0c95"). InnerVolumeSpecName "kube-api-access-49xwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.620235 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4a0a80d-56bf-4e3a-adbf-522c927a0c95" (UID: "a4a0a80d-56bf-4e3a-adbf-522c927a0c95"). InnerVolumeSpecName "catalog-content". 
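
Editor's note: the "Observed pod startup duration" entry above encodes a small calculation worth unpacking. podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (14:37:54.660237185 - 14:37:44 = 10.660237185s), and podStartSLOduration subtracts the image-pull window, taken from the monotonic m=+ readings rather than the wall-clock strings (m=+2104.493384858 - m=+2100.674948806 = 3.818436052s), giving 6.841801133s. A short Go sketch reproducing the arithmetic from the logged values:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the pod_startup_latency_tracker entry for
	// redhat-marketplace-wz2xz (monotonic m=+ offsets, in seconds).
	firstStartedPulling := 2100.674948806
	lastFinishedPulling := 2104.493384858

	// Wall-clock end-to-end startup: observed running time minus creation.
	created := time.Date(2026, time.January, 29, 14, 37, 44, 0, time.UTC)
	observed := time.Date(2026, time.January, 29, 14, 37, 54, 660237185, time.UTC)
	e2e := observed.Sub(created)

	pull := time.Duration((lastFinishedPulling - firstStartedPulling) * float64(time.Second))
	slo := e2e - pull

	fmt.Println("podStartE2EDuration:", e2e) // 10.660237185s
	fmt.Println("image pull window:  ", pull)
	fmt.Println("podStartSLOduration:", slo) // ~6.841801133s (up to float rounding)
}
```
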
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.687400 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.687453 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:38:00 crc kubenswrapper[4753]: I0129 14:38:00.687507 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49xwn\" (UniqueName: \"kubernetes.io/projected/a4a0a80d-56bf-4e3a-adbf-522c927a0c95-kube-api-access-49xwn\") on node \"crc\" DevicePath \"\"" Jan 29 14:38:01 crc kubenswrapper[4753]: I0129 14:38:01.113476 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz2xz" event={"ID":"a4a0a80d-56bf-4e3a-adbf-522c927a0c95","Type":"ContainerDied","Data":"9c6a8dce1a66006de9aebdc0256792943b2d9abadd8579e3eeb314f43d26137e"} Jan 29 14:38:01 crc kubenswrapper[4753]: I0129 14:38:01.113555 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz2xz" Jan 29 14:38:01 crc kubenswrapper[4753]: I0129 14:38:01.113603 4753 scope.go:117] "RemoveContainer" containerID="eeb5c9592aee6f4124adf4b882f5daf86fa66d80801492fb0302747ff4ca4236" Jan 29 14:38:01 crc kubenswrapper[4753]: I0129 14:38:01.161561 4753 scope.go:117] "RemoveContainer" containerID="74aa4868552fba6ad5ebba57875664929981a8ff53b1ba7130d5e79768c3bb72" Jan 29 14:38:01 crc kubenswrapper[4753]: I0129 14:38:01.167990 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz2xz"] Jan 29 14:38:01 crc kubenswrapper[4753]: I0129 14:38:01.178127 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz2xz"] Jan 29 14:38:01 crc kubenswrapper[4753]: I0129 14:38:01.207489 4753 scope.go:117] "RemoveContainer" containerID="bfc18149847a486d2da5b62d9f824fabb196401cef519f993cb3760724b46e59" Jan 29 14:38:02 crc kubenswrapper[4753]: I0129 14:38:02.166128 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" path="/var/lib/kubelet/pods/a4a0a80d-56bf-4e3a-adbf-522c927a0c95/volumes" Jan 29 14:38:57 crc kubenswrapper[4753]: I0129 14:38:57.055109 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:38:57 crc kubenswrapper[4753]: I0129 14:38:57.055749 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:39:27 crc kubenswrapper[4753]: I0129 14:39:27.054750 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:39:27 crc kubenswrapper[4753]: I0129 14:39:27.055843 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.285581 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9r7r2"] Jan 29 14:39:37 crc kubenswrapper[4753]: E0129 14:39:37.286631 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerName="registry-server" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.286652 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerName="registry-server" Jan 29 14:39:37 crc kubenswrapper[4753]: E0129 14:39:37.286675 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerName="extract-utilities" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.286685 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerName="extract-utilities" Jan 29 14:39:37 crc kubenswrapper[4753]: E0129 14:39:37.286721 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerName="extract-content" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.286732 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerName="extract-content" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.286939 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a0a80d-56bf-4e3a-adbf-522c927a0c95" containerName="registry-server" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.288525 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.302574 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9r7r2"] Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.357608 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-catalog-content\") pod \"community-operators-9r7r2\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.357675 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-utilities\") pod \"community-operators-9r7r2\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.357717 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7ptl\" (UniqueName: \"kubernetes.io/projected/c251f724-a172-4cb9-ad93-0b0145377d40-kube-api-access-s7ptl\") pod \"community-operators-9r7r2\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.459393 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-catalog-content\") pod \"community-operators-9r7r2\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.459475 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-utilities\") pod \"community-operators-9r7r2\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.459534 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7ptl\" (UniqueName: \"kubernetes.io/projected/c251f724-a172-4cb9-ad93-0b0145377d40-kube-api-access-s7ptl\") pod \"community-operators-9r7r2\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.460029 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-catalog-content\") pod \"community-operators-9r7r2\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.460135 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-utilities\") pod \"community-operators-9r7r2\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.480370 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s7ptl\" (UniqueName: \"kubernetes.io/projected/c251f724-a172-4cb9-ad93-0b0145377d40-kube-api-access-s7ptl\") pod \"community-operators-9r7r2\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.616944 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.880119 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9r7r2"] Jan 29 14:39:37 crc kubenswrapper[4753]: I0129 14:39:37.921457 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r7r2" event={"ID":"c251f724-a172-4cb9-ad93-0b0145377d40","Type":"ContainerStarted","Data":"dfce5f21088ad11190281a1faba834789b5771dde80242e51a46331cd7ac55a0"} Jan 29 14:39:38 crc kubenswrapper[4753]: I0129 14:39:38.932687 4753 generic.go:334] "Generic (PLEG): container finished" podID="c251f724-a172-4cb9-ad93-0b0145377d40" containerID="be7b3f8fc9f245dd5fe79d1d9e0aaa0987a805c22b014f59fa854adff6b9bdf5" exitCode=0 Jan 29 14:39:38 crc kubenswrapper[4753]: I0129 14:39:38.932860 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r7r2" event={"ID":"c251f724-a172-4cb9-ad93-0b0145377d40","Type":"ContainerDied","Data":"be7b3f8fc9f245dd5fe79d1d9e0aaa0987a805c22b014f59fa854adff6b9bdf5"} Jan 29 14:39:40 crc kubenswrapper[4753]: I0129 14:39:40.952665 4753 generic.go:334] "Generic (PLEG): container finished" podID="c251f724-a172-4cb9-ad93-0b0145377d40" containerID="a9fe9299c993cd5757fdf6939b39efcc3104c4b1e921c26df26593022c2596d6" exitCode=0 Jan 29 14:39:40 crc kubenswrapper[4753]: I0129 14:39:40.952890 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r7r2" event={"ID":"c251f724-a172-4cb9-ad93-0b0145377d40","Type":"ContainerDied","Data":"a9fe9299c993cd5757fdf6939b39efcc3104c4b1e921c26df26593022c2596d6"} Jan 29 14:39:42 crc kubenswrapper[4753]: I0129 14:39:42.980780 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r7r2" event={"ID":"c251f724-a172-4cb9-ad93-0b0145377d40","Type":"ContainerStarted","Data":"5deb5d844cc5f186f8f72b7e40dae2ecc4128d8c19612114e5a8eb9aa22a68d1"} Jan 29 14:39:47 crc kubenswrapper[4753]: I0129 14:39:47.617887 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:47 crc kubenswrapper[4753]: I0129 14:39:47.618451 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:47 crc kubenswrapper[4753]: I0129 14:39:47.692493 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:47 crc kubenswrapper[4753]: I0129 14:39:47.724083 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9r7r2" podStartSLOduration=7.886090922 podStartE2EDuration="10.72406718s" podCreationTimestamp="2026-01-29 14:39:37 +0000 UTC" firstStartedPulling="2026-01-29 14:39:38.9344917 +0000 UTC m=+2213.629226082" lastFinishedPulling="2026-01-29 14:39:41.772467958 +0000 UTC m=+2216.467202340" observedRunningTime="2026-01-29 
14:39:43.005855349 +0000 UTC m=+2217.700589741" watchObservedRunningTime="2026-01-29 14:39:47.72406718 +0000 UTC m=+2222.418801562" Jan 29 14:39:48 crc kubenswrapper[4753]: I0129 14:39:48.102550 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:48 crc kubenswrapper[4753]: I0129 14:39:48.168645 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9r7r2"] Jan 29 14:39:50 crc kubenswrapper[4753]: I0129 14:39:50.044468 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9r7r2" podUID="c251f724-a172-4cb9-ad93-0b0145377d40" containerName="registry-server" containerID="cri-o://5deb5d844cc5f186f8f72b7e40dae2ecc4128d8c19612114e5a8eb9aa22a68d1" gracePeriod=2 Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.056035 4753 generic.go:334] "Generic (PLEG): container finished" podID="c251f724-a172-4cb9-ad93-0b0145377d40" containerID="5deb5d844cc5f186f8f72b7e40dae2ecc4128d8c19612114e5a8eb9aa22a68d1" exitCode=0 Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.056121 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r7r2" event={"ID":"c251f724-a172-4cb9-ad93-0b0145377d40","Type":"ContainerDied","Data":"5deb5d844cc5f186f8f72b7e40dae2ecc4128d8c19612114e5a8eb9aa22a68d1"} Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.056562 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9r7r2" event={"ID":"c251f724-a172-4cb9-ad93-0b0145377d40","Type":"ContainerDied","Data":"dfce5f21088ad11190281a1faba834789b5771dde80242e51a46331cd7ac55a0"} Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.056588 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfce5f21088ad11190281a1faba834789b5771dde80242e51a46331cd7ac55a0" Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.071913 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.166471 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7ptl\" (UniqueName: \"kubernetes.io/projected/c251f724-a172-4cb9-ad93-0b0145377d40-kube-api-access-s7ptl\") pod \"c251f724-a172-4cb9-ad93-0b0145377d40\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.166592 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-catalog-content\") pod \"c251f724-a172-4cb9-ad93-0b0145377d40\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.166624 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-utilities\") pod \"c251f724-a172-4cb9-ad93-0b0145377d40\" (UID: \"c251f724-a172-4cb9-ad93-0b0145377d40\") " Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.168436 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-utilities" (OuterVolumeSpecName: "utilities") pod "c251f724-a172-4cb9-ad93-0b0145377d40" (UID: "c251f724-a172-4cb9-ad93-0b0145377d40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.175511 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c251f724-a172-4cb9-ad93-0b0145377d40-kube-api-access-s7ptl" (OuterVolumeSpecName: "kube-api-access-s7ptl") pod "c251f724-a172-4cb9-ad93-0b0145377d40" (UID: "c251f724-a172-4cb9-ad93-0b0145377d40"). InnerVolumeSpecName "kube-api-access-s7ptl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.268493 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7ptl\" (UniqueName: \"kubernetes.io/projected/c251f724-a172-4cb9-ad93-0b0145377d40-kube-api-access-s7ptl\") on node \"crc\" DevicePath \"\"" Jan 29 14:39:51 crc kubenswrapper[4753]: I0129 14:39:51.268849 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:39:52 crc kubenswrapper[4753]: I0129 14:39:52.063882 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9r7r2" Jan 29 14:39:52 crc kubenswrapper[4753]: I0129 14:39:52.318582 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c251f724-a172-4cb9-ad93-0b0145377d40" (UID: "c251f724-a172-4cb9-ad93-0b0145377d40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:39:52 crc kubenswrapper[4753]: I0129 14:39:52.387675 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c251f724-a172-4cb9-ad93-0b0145377d40-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:39:52 crc kubenswrapper[4753]: I0129 14:39:52.422924 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9r7r2"] Jan 29 14:39:52 crc kubenswrapper[4753]: I0129 14:39:52.436497 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9r7r2"] Jan 29 14:39:54 crc kubenswrapper[4753]: I0129 14:39:54.163777 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c251f724-a172-4cb9-ad93-0b0145377d40" path="/var/lib/kubelet/pods/c251f724-a172-4cb9-ad93-0b0145377d40/volumes" Jan 29 14:39:57 crc kubenswrapper[4753]: I0129 14:39:57.054387 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:39:57 crc kubenswrapper[4753]: I0129 14:39:57.054464 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:39:57 crc kubenswrapper[4753]: I0129 14:39:57.054521 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:39:57 crc kubenswrapper[4753]: I0129 14:39:57.055344 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:39:57 crc kubenswrapper[4753]: I0129 14:39:57.055416 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" gracePeriod=600 Jan 29 14:39:57 crc kubenswrapper[4753]: E0129 14:39:57.691391 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:39:58 crc kubenswrapper[4753]: I0129 14:39:58.116808 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" exitCode=0 Jan 29 14:39:58 crc kubenswrapper[4753]: I0129 14:39:58.116865 4753 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47"} Jan 29 14:39:58 crc kubenswrapper[4753]: I0129 14:39:58.116909 4753 scope.go:117] "RemoveContainer" containerID="0f86dfb6acef39d11ef3c29fe257ba700155bfdc434286a050a57a0cbe7089c3" Jan 29 14:39:58 crc kubenswrapper[4753]: I0129 14:39:58.117626 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:39:58 crc kubenswrapper[4753]: E0129 14:39:58.117931 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:40:13 crc kubenswrapper[4753]: I0129 14:40:13.149954 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:40:13 crc kubenswrapper[4753]: E0129 14:40:13.151019 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:40:24 crc kubenswrapper[4753]: I0129 14:40:24.149935 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:40:24 crc kubenswrapper[4753]: E0129 14:40:24.150966 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:40:39 crc kubenswrapper[4753]: I0129 14:40:39.149740 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:40:39 crc kubenswrapper[4753]: E0129 14:40:39.151142 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:40:50 crc kubenswrapper[4753]: I0129 14:40:50.149066 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:40:50 crc kubenswrapper[4753]: E0129 14:40:50.149806 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:41:04 crc kubenswrapper[4753]: I0129 14:41:04.149844 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:41:04 crc kubenswrapper[4753]: E0129 14:41:04.150806 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:41:16 crc kubenswrapper[4753]: I0129 14:41:16.155722 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:41:16 crc kubenswrapper[4753]: E0129 14:41:16.157407 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:41:31 crc kubenswrapper[4753]: I0129 14:41:31.149309 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:41:31 crc kubenswrapper[4753]: E0129 14:41:31.150126 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:41:44 crc kubenswrapper[4753]: I0129 14:41:44.149326 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:41:44 crc kubenswrapper[4753]: E0129 14:41:44.150062 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:41:59 crc kubenswrapper[4753]: I0129 14:41:59.149470 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:41:59 crc kubenswrapper[4753]: E0129 14:41:59.150187 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" 
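
Editor's note: this long run of "Error syncing pod, skipping ... CrashLoopBackOff: back-off 5m0s" entries is the kubelet's restart backoff at its cap. Each crash doubles the restart delay up to a five-minute maximum, and every sync attempt during the window just re-logs the refusal; the daemon, which died at 14:39:58, is finally restarted at 14:44:59 further down, about five minutes later. A minimal sketch of the capped doubling; the 10s initial delay is the upstream kubelet default as far as I know, not something stated in this log:

```go
package main

import (
	"fmt"
	"time"
)

// nextBackoff doubles the previous restart delay and clamps it at the limit,
// the pattern behind "back-off 5m0s restarting failed container=...".
func nextBackoff(prev, limit time.Duration) time.Duration {
	next := prev * 2
	if next > limit {
		return limit
	}
	return next
}

func main() {
	const maxBackoff = 5 * time.Minute // matches the "back-off 5m0s" in the log
	delay := 10 * time.Second          // assumed initial crash-loop delay
	for i := 1; i <= 7; i++ {
		fmt.Printf("restart %d: wait %v before next attempt\n", i, delay)
		delay = nextBackoff(delay, maxBackoff)
	}
	// The delay reaches the 5m cap by the sixth restart and stays there,
	// which is why the daemon here only comes back ~5 minutes after dying.
}
```
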
podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:42:13 crc kubenswrapper[4753]: I0129 14:42:13.149945 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:42:13 crc kubenswrapper[4753]: E0129 14:42:13.150789 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:42:25 crc kubenswrapper[4753]: I0129 14:42:25.149624 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:42:25 crc kubenswrapper[4753]: E0129 14:42:25.150795 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:42:37 crc kubenswrapper[4753]: I0129 14:42:37.149553 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:42:37 crc kubenswrapper[4753]: E0129 14:42:37.150520 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:42:50 crc kubenswrapper[4753]: I0129 14:42:50.149798 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:42:50 crc kubenswrapper[4753]: E0129 14:42:50.152208 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:43:02 crc kubenswrapper[4753]: I0129 14:43:02.149927 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:43:02 crc kubenswrapper[4753]: E0129 14:43:02.151104 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:43:14 crc kubenswrapper[4753]: I0129 14:43:14.150246 4753 scope.go:117] "RemoveContainer" 
containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:43:14 crc kubenswrapper[4753]: E0129 14:43:14.151515 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:43:26 crc kubenswrapper[4753]: I0129 14:43:26.153872 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:43:26 crc kubenswrapper[4753]: E0129 14:43:26.154726 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:43:41 crc kubenswrapper[4753]: I0129 14:43:41.149192 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:43:41 crc kubenswrapper[4753]: E0129 14:43:41.149972 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:43:54 crc kubenswrapper[4753]: I0129 14:43:54.156033 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:43:54 crc kubenswrapper[4753]: E0129 14:43:54.157302 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:44:06 crc kubenswrapper[4753]: I0129 14:44:06.156342 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:44:06 crc kubenswrapper[4753]: E0129 14:44:06.157665 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:44:20 crc kubenswrapper[4753]: I0129 14:44:20.151069 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:44:20 crc kubenswrapper[4753]: E0129 14:44:20.151818 4753 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:44:35 crc kubenswrapper[4753]: I0129 14:44:35.149535 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:44:35 crc kubenswrapper[4753]: E0129 14:44:35.150344 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:44:48 crc kubenswrapper[4753]: I0129 14:44:48.149718 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:44:48 crc kubenswrapper[4753]: E0129 14:44:48.150430 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:44:59 crc kubenswrapper[4753]: I0129 14:44:59.149629 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:44:59 crc kubenswrapper[4753]: I0129 14:44:59.564257 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"f2942e28242220ecf81b3ccb242246ca35d9dd454dc42c62353d1a49f59b34e7"} Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.142573 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g"] Jan 29 14:45:00 crc kubenswrapper[4753]: E0129 14:45:00.142942 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c251f724-a172-4cb9-ad93-0b0145377d40" containerName="extract-content" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.142966 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c251f724-a172-4cb9-ad93-0b0145377d40" containerName="extract-content" Jan 29 14:45:00 crc kubenswrapper[4753]: E0129 14:45:00.142981 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c251f724-a172-4cb9-ad93-0b0145377d40" containerName="registry-server" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.142989 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c251f724-a172-4cb9-ad93-0b0145377d40" containerName="registry-server" Jan 29 14:45:00 crc kubenswrapper[4753]: E0129 14:45:00.143013 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c251f724-a172-4cb9-ad93-0b0145377d40" containerName="extract-utilities" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.143024 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c251f724-a172-4cb9-ad93-0b0145377d40" containerName="extract-utilities" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.143213 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c251f724-a172-4cb9-ad93-0b0145377d40" containerName="registry-server" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.143784 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.145522 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.147370 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.166140 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g"] Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.263827 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0ef84df-77f3-4784-9297-24192a945026-secret-volume\") pod \"collect-profiles-29494965-nts8g\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.264455 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6zh6\" (UniqueName: \"kubernetes.io/projected/c0ef84df-77f3-4784-9297-24192a945026-kube-api-access-p6zh6\") pod \"collect-profiles-29494965-nts8g\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.264634 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0ef84df-77f3-4784-9297-24192a945026-config-volume\") pod \"collect-profiles-29494965-nts8g\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.365737 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0ef84df-77f3-4784-9297-24192a945026-config-volume\") pod \"collect-profiles-29494965-nts8g\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.365781 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0ef84df-77f3-4784-9297-24192a945026-secret-volume\") pod \"collect-profiles-29494965-nts8g\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.365834 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6zh6\" (UniqueName: \"kubernetes.io/projected/c0ef84df-77f3-4784-9297-24192a945026-kube-api-access-p6zh6\") pod 
\"collect-profiles-29494965-nts8g\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.367104 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0ef84df-77f3-4784-9297-24192a945026-config-volume\") pod \"collect-profiles-29494965-nts8g\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.371667 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0ef84df-77f3-4784-9297-24192a945026-secret-volume\") pod \"collect-profiles-29494965-nts8g\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.383427 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6zh6\" (UniqueName: \"kubernetes.io/projected/c0ef84df-77f3-4784-9297-24192a945026-kube-api-access-p6zh6\") pod \"collect-profiles-29494965-nts8g\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.462020 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:00 crc kubenswrapper[4753]: I0129 14:45:00.938535 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g"] Jan 29 14:45:01 crc kubenswrapper[4753]: I0129 14:45:01.593774 4753 generic.go:334] "Generic (PLEG): container finished" podID="c0ef84df-77f3-4784-9297-24192a945026" containerID="ea199e04f3cbe79117dab2c6f58bb9d022ccfb9c01a1e0fb349945c347aefd7c" exitCode=0 Jan 29 14:45:01 crc kubenswrapper[4753]: I0129 14:45:01.593861 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" event={"ID":"c0ef84df-77f3-4784-9297-24192a945026","Type":"ContainerDied","Data":"ea199e04f3cbe79117dab2c6f58bb9d022ccfb9c01a1e0fb349945c347aefd7c"} Jan 29 14:45:01 crc kubenswrapper[4753]: I0129 14:45:01.594173 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" event={"ID":"c0ef84df-77f3-4784-9297-24192a945026","Type":"ContainerStarted","Data":"7f9d5ab06a535a3cc6e8532ef8b5851d6fd067b62e9df5520595bd3ba2378892"} Jan 29 14:45:02 crc kubenswrapper[4753]: I0129 14:45:02.901066 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.008766 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6zh6\" (UniqueName: \"kubernetes.io/projected/c0ef84df-77f3-4784-9297-24192a945026-kube-api-access-p6zh6\") pod \"c0ef84df-77f3-4784-9297-24192a945026\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.008880 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0ef84df-77f3-4784-9297-24192a945026-secret-volume\") pod \"c0ef84df-77f3-4784-9297-24192a945026\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.008917 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0ef84df-77f3-4784-9297-24192a945026-config-volume\") pod \"c0ef84df-77f3-4784-9297-24192a945026\" (UID: \"c0ef84df-77f3-4784-9297-24192a945026\") " Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.009883 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0ef84df-77f3-4784-9297-24192a945026-config-volume" (OuterVolumeSpecName: "config-volume") pod "c0ef84df-77f3-4784-9297-24192a945026" (UID: "c0ef84df-77f3-4784-9297-24192a945026"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.013850 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ef84df-77f3-4784-9297-24192a945026-kube-api-access-p6zh6" (OuterVolumeSpecName: "kube-api-access-p6zh6") pod "c0ef84df-77f3-4784-9297-24192a945026" (UID: "c0ef84df-77f3-4784-9297-24192a945026"). InnerVolumeSpecName "kube-api-access-p6zh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.016873 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0ef84df-77f3-4784-9297-24192a945026-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c0ef84df-77f3-4784-9297-24192a945026" (UID: "c0ef84df-77f3-4784-9297-24192a945026"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.110823 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6zh6\" (UniqueName: \"kubernetes.io/projected/c0ef84df-77f3-4784-9297-24192a945026-kube-api-access-p6zh6\") on node \"crc\" DevicePath \"\"" Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.110888 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0ef84df-77f3-4784-9297-24192a945026-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.110902 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0ef84df-77f3-4784-9297-24192a945026-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.609107 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" event={"ID":"c0ef84df-77f3-4784-9297-24192a945026","Type":"ContainerDied","Data":"7f9d5ab06a535a3cc6e8532ef8b5851d6fd067b62e9df5520595bd3ba2378892"} Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.609538 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f9d5ab06a535a3cc6e8532ef8b5851d6fd067b62e9df5520595bd3ba2378892" Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.609214 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g" Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.977464 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp"] Jan 29 14:45:03 crc kubenswrapper[4753]: I0129 14:45:03.982344 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494920-hwfxp"] Jan 29 14:45:04 crc kubenswrapper[4753]: I0129 14:45:04.160799 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647fe5d7-4243-4608-8351-6bc2e13b9f15" path="/var/lib/kubelet/pods/647fe5d7-4243-4608-8351-6bc2e13b9f15/volumes" Jan 29 14:45:47 crc kubenswrapper[4753]: I0129 14:45:47.856987 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sh6dc"] Jan 29 14:45:47 crc kubenswrapper[4753]: E0129 14:45:47.858928 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ef84df-77f3-4784-9297-24192a945026" containerName="collect-profiles" Jan 29 14:45:47 crc kubenswrapper[4753]: I0129 14:45:47.858952 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ef84df-77f3-4784-9297-24192a945026" containerName="collect-profiles" Jan 29 14:45:47 crc kubenswrapper[4753]: I0129 14:45:47.859132 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ef84df-77f3-4784-9297-24192a945026" containerName="collect-profiles" Jan 29 14:45:47 crc kubenswrapper[4753]: I0129 14:45:47.860507 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:47 crc kubenswrapper[4753]: I0129 14:45:47.883700 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sh6dc"] Jan 29 14:45:47 crc kubenswrapper[4753]: I0129 14:45:47.933898 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-catalog-content\") pod \"redhat-operators-sh6dc\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:47 crc kubenswrapper[4753]: I0129 14:45:47.933965 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqstt\" (UniqueName: \"kubernetes.io/projected/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-kube-api-access-mqstt\") pod \"redhat-operators-sh6dc\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:47 crc kubenswrapper[4753]: I0129 14:45:47.934075 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-utilities\") pod \"redhat-operators-sh6dc\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:48 crc kubenswrapper[4753]: I0129 14:45:48.035366 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-utilities\") pod \"redhat-operators-sh6dc\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:48 crc kubenswrapper[4753]: I0129 14:45:48.035476 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-catalog-content\") pod \"redhat-operators-sh6dc\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:48 crc kubenswrapper[4753]: I0129 14:45:48.035508 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqstt\" (UniqueName: \"kubernetes.io/projected/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-kube-api-access-mqstt\") pod \"redhat-operators-sh6dc\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:48 crc kubenswrapper[4753]: I0129 14:45:48.036073 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-catalog-content\") pod \"redhat-operators-sh6dc\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:48 crc kubenswrapper[4753]: I0129 14:45:48.036353 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-utilities\") pod \"redhat-operators-sh6dc\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:48 crc kubenswrapper[4753]: I0129 14:45:48.059756 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mqstt\" (UniqueName: \"kubernetes.io/projected/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-kube-api-access-mqstt\") pod \"redhat-operators-sh6dc\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:48 crc kubenswrapper[4753]: I0129 14:45:48.179065 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:45:48 crc kubenswrapper[4753]: I0129 14:45:48.433250 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sh6dc"] Jan 29 14:45:48 crc kubenswrapper[4753]: I0129 14:45:48.976126 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh6dc" event={"ID":"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f","Type":"ContainerStarted","Data":"5ef6e05c731a471c738f0b98612a96a8d0fa1f6a4843812bf46afdc7813ac589"} Jan 29 14:45:49 crc kubenswrapper[4753]: I0129 14:45:49.986278 4753 generic.go:334] "Generic (PLEG): container finished" podID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerID="49be7c3c7204ddfdec702153bb766aa605b6082446492ba133d21d9551a52903" exitCode=0 Jan 29 14:45:49 crc kubenswrapper[4753]: I0129 14:45:49.986368 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh6dc" event={"ID":"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f","Type":"ContainerDied","Data":"49be7c3c7204ddfdec702153bb766aa605b6082446492ba133d21d9551a52903"} Jan 29 14:45:49 crc kubenswrapper[4753]: I0129 14:45:49.989822 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 14:45:59 crc kubenswrapper[4753]: I0129 14:45:59.064690 4753 generic.go:334] "Generic (PLEG): container finished" podID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerID="db80d2ce4985b467c7ddd081ff3bdf99236713ef4d3aae77c5946b0fec38f04c" exitCode=0 Jan 29 14:45:59 crc kubenswrapper[4753]: I0129 14:45:59.064775 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh6dc" event={"ID":"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f","Type":"ContainerDied","Data":"db80d2ce4985b467c7ddd081ff3bdf99236713ef4d3aae77c5946b0fec38f04c"} Jan 29 14:46:03 crc kubenswrapper[4753]: I0129 14:46:03.774116 4753 scope.go:117] "RemoveContainer" containerID="f7344526d39deb3662a79603e6c72db1784dcdad1b3d65cb4b36220c87cb7682" Jan 29 14:46:04 crc kubenswrapper[4753]: I0129 14:46:04.109695 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh6dc" event={"ID":"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f","Type":"ContainerStarted","Data":"eb1d85b9fce7354b748fecd9bcf7879fb3ed1fe147faac5a11818af2f922482d"} Jan 29 14:46:04 crc kubenswrapper[4753]: I0129 14:46:04.134791 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sh6dc" podStartSLOduration=4.263485887 podStartE2EDuration="17.134768609s" podCreationTimestamp="2026-01-29 14:45:47 +0000 UTC" firstStartedPulling="2026-01-29 14:45:49.989302871 +0000 UTC m=+2584.684037293" lastFinishedPulling="2026-01-29 14:46:02.860585633 +0000 UTC m=+2597.555320015" observedRunningTime="2026-01-29 14:46:04.127882994 +0000 UTC m=+2598.822617376" watchObservedRunningTime="2026-01-29 14:46:04.134768609 +0000 UTC m=+2598.829503011" Jan 29 14:46:04 crc kubenswrapper[4753]: I0129 14:46:04.264909 4753 scope.go:117] "RemoveContainer" 
containerID="be7b3f8fc9f245dd5fe79d1d9e0aaa0987a805c22b014f59fa854adff6b9bdf5" Jan 29 14:46:04 crc kubenswrapper[4753]: I0129 14:46:04.286030 4753 scope.go:117] "RemoveContainer" containerID="5deb5d844cc5f186f8f72b7e40dae2ecc4128d8c19612114e5a8eb9aa22a68d1" Jan 29 14:46:04 crc kubenswrapper[4753]: I0129 14:46:04.306707 4753 scope.go:117] "RemoveContainer" containerID="a9fe9299c993cd5757fdf6939b39efcc3104c4b1e921c26df26593022c2596d6" Jan 29 14:46:08 crc kubenswrapper[4753]: I0129 14:46:08.179650 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:46:08 crc kubenswrapper[4753]: I0129 14:46:08.180211 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:46:09 crc kubenswrapper[4753]: I0129 14:46:09.232397 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sh6dc" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerName="registry-server" probeResult="failure" output=< Jan 29 14:46:09 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 14:46:09 crc kubenswrapper[4753]: > Jan 29 14:46:18 crc kubenswrapper[4753]: I0129 14:46:18.220948 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:46:18 crc kubenswrapper[4753]: I0129 14:46:18.283455 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:46:19 crc kubenswrapper[4753]: I0129 14:46:19.047742 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sh6dc"] Jan 29 14:46:19 crc kubenswrapper[4753]: I0129 14:46:19.264942 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sh6dc" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerName="registry-server" containerID="cri-o://eb1d85b9fce7354b748fecd9bcf7879fb3ed1fe147faac5a11818af2f922482d" gracePeriod=2 Jan 29 14:46:20 crc kubenswrapper[4753]: I0129 14:46:20.274923 4753 generic.go:334] "Generic (PLEG): container finished" podID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerID="eb1d85b9fce7354b748fecd9bcf7879fb3ed1fe147faac5a11818af2f922482d" exitCode=0 Jan 29 14:46:20 crc kubenswrapper[4753]: I0129 14:46:20.274977 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh6dc" event={"ID":"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f","Type":"ContainerDied","Data":"eb1d85b9fce7354b748fecd9bcf7879fb3ed1fe147faac5a11818af2f922482d"} Jan 29 14:46:20 crc kubenswrapper[4753]: I0129 14:46:20.870749 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:46:20 crc kubenswrapper[4753]: I0129 14:46:20.914840 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqstt\" (UniqueName: \"kubernetes.io/projected/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-kube-api-access-mqstt\") pod \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " Jan 29 14:46:20 crc kubenswrapper[4753]: I0129 14:46:20.914937 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-catalog-content\") pod \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " Jan 29 14:46:20 crc kubenswrapper[4753]: I0129 14:46:20.915006 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-utilities\") pod \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\" (UID: \"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f\") " Jan 29 14:46:20 crc kubenswrapper[4753]: I0129 14:46:20.915944 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-utilities" (OuterVolumeSpecName: "utilities") pod "cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" (UID: "cb78f7c5-0646-4075-8e8a-4dc5585c3d2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:46:20 crc kubenswrapper[4753]: I0129 14:46:20.921756 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-kube-api-access-mqstt" (OuterVolumeSpecName: "kube-api-access-mqstt") pod "cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" (UID: "cb78f7c5-0646-4075-8e8a-4dc5585c3d2f"). InnerVolumeSpecName "kube-api-access-mqstt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.016506 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqstt\" (UniqueName: \"kubernetes.io/projected/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-kube-api-access-mqstt\") on node \"crc\" DevicePath \"\"" Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.016568 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.047165 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" (UID: "cb78f7c5-0646-4075-8e8a-4dc5585c3d2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.118015 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.284124 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh6dc" event={"ID":"cb78f7c5-0646-4075-8e8a-4dc5585c3d2f","Type":"ContainerDied","Data":"5ef6e05c731a471c738f0b98612a96a8d0fa1f6a4843812bf46afdc7813ac589"} Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.284205 4753 scope.go:117] "RemoveContainer" containerID="eb1d85b9fce7354b748fecd9bcf7879fb3ed1fe147faac5a11818af2f922482d" Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.284215 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sh6dc" Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.304707 4753 scope.go:117] "RemoveContainer" containerID="db80d2ce4985b467c7ddd081ff3bdf99236713ef4d3aae77c5946b0fec38f04c" Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.319496 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sh6dc"] Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.324714 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sh6dc"] Jan 29 14:46:21 crc kubenswrapper[4753]: I0129 14:46:21.326398 4753 scope.go:117] "RemoveContainer" containerID="49be7c3c7204ddfdec702153bb766aa605b6082446492ba133d21d9551a52903" Jan 29 14:46:22 crc kubenswrapper[4753]: I0129 14:46:22.162571 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" path="/var/lib/kubelet/pods/cb78f7c5-0646-4075-8e8a-4dc5585c3d2f/volumes" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.232722 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c86cz"] Jan 29 14:47:03 crc kubenswrapper[4753]: E0129 14:47:03.234442 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerName="registry-server" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.234464 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerName="registry-server" Jan 29 14:47:03 crc kubenswrapper[4753]: E0129 14:47:03.234493 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerName="extract-utilities" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.234503 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerName="extract-utilities" Jan 29 14:47:03 crc kubenswrapper[4753]: E0129 14:47:03.234527 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerName="extract-content" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.234536 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerName="extract-content" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.234746 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb78f7c5-0646-4075-8e8a-4dc5585c3d2f" containerName="registry-server" Jan 29 14:47:03 crc 
kubenswrapper[4753]: I0129 14:47:03.236420 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.251242 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c86cz"] Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.336257 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-catalog-content\") pod \"certified-operators-c86cz\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.336593 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-utilities\") pod \"certified-operators-c86cz\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.336839 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26p8z\" (UniqueName: \"kubernetes.io/projected/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-kube-api-access-26p8z\") pod \"certified-operators-c86cz\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.438102 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-catalog-content\") pod \"certified-operators-c86cz\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.438524 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-utilities\") pod \"certified-operators-c86cz\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.438661 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26p8z\" (UniqueName: \"kubernetes.io/projected/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-kube-api-access-26p8z\") pod \"certified-operators-c86cz\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.438942 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-catalog-content\") pod \"certified-operators-c86cz\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.439141 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-utilities\") pod \"certified-operators-c86cz\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " pod="openshift-marketplace/certified-operators-c86cz" Jan 
29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.462251 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26p8z\" (UniqueName: \"kubernetes.io/projected/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-kube-api-access-26p8z\") pod \"certified-operators-c86cz\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.573924 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:03 crc kubenswrapper[4753]: I0129 14:47:03.951643 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c86cz"] Jan 29 14:47:04 crc kubenswrapper[4753]: I0129 14:47:04.655298 4753 generic.go:334] "Generic (PLEG): container finished" podID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerID="ace5a993cad320757dfc776540fa39ffde2456e458219679ee197e5df0cc7494" exitCode=0 Jan 29 14:47:04 crc kubenswrapper[4753]: I0129 14:47:04.655343 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c86cz" event={"ID":"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c","Type":"ContainerDied","Data":"ace5a993cad320757dfc776540fa39ffde2456e458219679ee197e5df0cc7494"} Jan 29 14:47:04 crc kubenswrapper[4753]: I0129 14:47:04.655416 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c86cz" event={"ID":"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c","Type":"ContainerStarted","Data":"b9460fc267db7f2b355b769c989e0f0f41d7ad16b1fddd01c038ac71b4d9ae5a"} Jan 29 14:47:07 crc kubenswrapper[4753]: I0129 14:47:07.677087 4753 generic.go:334] "Generic (PLEG): container finished" podID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerID="bcd7e52980f0b15b4996a3ff951306ba0b2f5d0bd2c0831664d2ce02864fbcc5" exitCode=0 Jan 29 14:47:07 crc kubenswrapper[4753]: I0129 14:47:07.677240 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c86cz" event={"ID":"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c","Type":"ContainerDied","Data":"bcd7e52980f0b15b4996a3ff951306ba0b2f5d0bd2c0831664d2ce02864fbcc5"} Jan 29 14:47:07 crc kubenswrapper[4753]: E0129 14:47:07.710795 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ce5cb4_7a9a_4068_9d3e_29c3ef8db11c.slice/crio-bcd7e52980f0b15b4996a3ff951306ba0b2f5d0bd2c0831664d2ce02864fbcc5.scope\": RecentStats: unable to find data in memory cache]" Jan 29 14:47:09 crc kubenswrapper[4753]: I0129 14:47:09.691196 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c86cz" event={"ID":"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c","Type":"ContainerStarted","Data":"6ebb7345b6668de094c41541213c5bbd89952a5d046391785daf56d2a83c3b98"} Jan 29 14:47:09 crc kubenswrapper[4753]: I0129 14:47:09.712200 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c86cz" podStartSLOduration=2.794175674 podStartE2EDuration="6.712178406s" podCreationTimestamp="2026-01-29 14:47:03 +0000 UTC" firstStartedPulling="2026-01-29 14:47:04.656688925 +0000 UTC m=+2659.351423307" lastFinishedPulling="2026-01-29 14:47:08.574691617 +0000 UTC m=+2663.269426039" observedRunningTime="2026-01-29 14:47:09.705349932 +0000 UTC m=+2664.400084314" 
watchObservedRunningTime="2026-01-29 14:47:09.712178406 +0000 UTC m=+2664.406912808" Jan 29 14:47:13 crc kubenswrapper[4753]: I0129 14:47:13.575264 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:13 crc kubenswrapper[4753]: I0129 14:47:13.575649 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:13 crc kubenswrapper[4753]: I0129 14:47:13.625862 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:13 crc kubenswrapper[4753]: I0129 14:47:13.758887 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:13 crc kubenswrapper[4753]: I0129 14:47:13.861294 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c86cz"] Jan 29 14:47:15 crc kubenswrapper[4753]: I0129 14:47:15.727504 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c86cz" podUID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerName="registry-server" containerID="cri-o://6ebb7345b6668de094c41541213c5bbd89952a5d046391785daf56d2a83c3b98" gracePeriod=2 Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.736448 4753 generic.go:334] "Generic (PLEG): container finished" podID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerID="6ebb7345b6668de094c41541213c5bbd89952a5d046391785daf56d2a83c3b98" exitCode=0 Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.736490 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c86cz" event={"ID":"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c","Type":"ContainerDied","Data":"6ebb7345b6668de094c41541213c5bbd89952a5d046391785daf56d2a83c3b98"} Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.736876 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c86cz" event={"ID":"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c","Type":"ContainerDied","Data":"b9460fc267db7f2b355b769c989e0f0f41d7ad16b1fddd01c038ac71b4d9ae5a"} Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.736897 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9460fc267db7f2b355b769c989e0f0f41d7ad16b1fddd01c038ac71b4d9ae5a" Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.765323 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.925901 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-catalog-content\") pod \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.925995 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26p8z\" (UniqueName: \"kubernetes.io/projected/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-kube-api-access-26p8z\") pod \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.926079 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-utilities\") pod \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\" (UID: \"46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c\") " Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.927170 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-utilities" (OuterVolumeSpecName: "utilities") pod "46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" (UID: "46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.939676 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-kube-api-access-26p8z" (OuterVolumeSpecName: "kube-api-access-26p8z") pod "46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" (UID: "46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c"). InnerVolumeSpecName "kube-api-access-26p8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:47:16 crc kubenswrapper[4753]: I0129 14:47:16.973611 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" (UID: "46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:47:17 crc kubenswrapper[4753]: I0129 14:47:17.027246 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:47:17 crc kubenswrapper[4753]: I0129 14:47:17.027273 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:47:17 crc kubenswrapper[4753]: I0129 14:47:17.027284 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26p8z\" (UniqueName: \"kubernetes.io/projected/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c-kube-api-access-26p8z\") on node \"crc\" DevicePath \"\"" Jan 29 14:47:17 crc kubenswrapper[4753]: I0129 14:47:17.742225 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c86cz" Jan 29 14:47:17 crc kubenswrapper[4753]: I0129 14:47:17.773715 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c86cz"] Jan 29 14:47:17 crc kubenswrapper[4753]: I0129 14:47:17.788886 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c86cz"] Jan 29 14:47:17 crc kubenswrapper[4753]: E0129 14:47:17.888665 4753 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ce5cb4_7a9a_4068_9d3e_29c3ef8db11c.slice/crio-b9460fc267db7f2b355b769c989e0f0f41d7ad16b1fddd01c038ac71b4d9ae5a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ce5cb4_7a9a_4068_9d3e_29c3ef8db11c.slice\": RecentStats: unable to find data in memory cache]" Jan 29 14:47:18 crc kubenswrapper[4753]: I0129 14:47:18.157182 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" path="/var/lib/kubelet/pods/46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c/volumes" Jan 29 14:47:27 crc kubenswrapper[4753]: I0129 14:47:27.055076 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:47:27 crc kubenswrapper[4753]: I0129 14:47:27.055768 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:47:57 crc kubenswrapper[4753]: I0129 14:47:57.055579 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:47:57 crc kubenswrapper[4753]: I0129 14:47:57.057354 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:48:00 crc kubenswrapper[4753]: E0129 14:48:00.401624 4753 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="6.253s" Jan 29 14:48:27 crc kubenswrapper[4753]: I0129 14:48:27.055139 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:48:27 crc kubenswrapper[4753]: I0129 14:48:27.055706 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:48:27 crc kubenswrapper[4753]: I0129 14:48:27.055760 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:48:27 crc kubenswrapper[4753]: I0129 14:48:27.056474 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2942e28242220ecf81b3ccb242246ca35d9dd454dc42c62353d1a49f59b34e7"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:48:27 crc kubenswrapper[4753]: I0129 14:48:27.056529 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://f2942e28242220ecf81b3ccb242246ca35d9dd454dc42c62353d1a49f59b34e7" gracePeriod=600 Jan 29 14:48:27 crc kubenswrapper[4753]: I0129 14:48:27.258196 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="f2942e28242220ecf81b3ccb242246ca35d9dd454dc42c62353d1a49f59b34e7" exitCode=0 Jan 29 14:48:27 crc kubenswrapper[4753]: I0129 14:48:27.258259 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"f2942e28242220ecf81b3ccb242246ca35d9dd454dc42c62353d1a49f59b34e7"} Jan 29 14:48:27 crc kubenswrapper[4753]: I0129 14:48:27.258612 4753 scope.go:117] "RemoveContainer" containerID="890e49725611ae0d781a7aba5582a49412c6eb1706f40ec49f3a0f7731e80b47" Jan 29 14:48:28 crc kubenswrapper[4753]: I0129 14:48:28.272681 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8"} Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.334046 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w9xk8"] Jan 29 14:48:31 crc kubenswrapper[4753]: E0129 14:48:31.334757 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerName="extract-content" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.334770 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerName="extract-content" Jan 29 14:48:31 crc kubenswrapper[4753]: E0129 14:48:31.334786 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerName="registry-server" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.334792 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerName="registry-server" Jan 29 14:48:31 crc kubenswrapper[4753]: E0129 14:48:31.334810 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerName="extract-utilities" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.334817 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerName="extract-utilities" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.334955 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="46ce5cb4-7a9a-4068-9d3e-29c3ef8db11c" containerName="registry-server" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.335843 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.377819 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9xk8"] Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.383917 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-catalog-content\") pod \"redhat-marketplace-w9xk8\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.384231 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4gx\" (UniqueName: \"kubernetes.io/projected/99c9f5af-80f2-44ae-88b3-604c39032aa7-kube-api-access-qg4gx\") pod \"redhat-marketplace-w9xk8\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.384364 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-utilities\") pod \"redhat-marketplace-w9xk8\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.485516 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg4gx\" (UniqueName: \"kubernetes.io/projected/99c9f5af-80f2-44ae-88b3-604c39032aa7-kube-api-access-qg4gx\") pod \"redhat-marketplace-w9xk8\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.485664 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-utilities\") pod \"redhat-marketplace-w9xk8\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.485739 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-catalog-content\") pod \"redhat-marketplace-w9xk8\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.486292 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-utilities\") pod \"redhat-marketplace-w9xk8\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.486409 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-catalog-content\") pod \"redhat-marketplace-w9xk8\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.506911 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg4gx\" (UniqueName: \"kubernetes.io/projected/99c9f5af-80f2-44ae-88b3-604c39032aa7-kube-api-access-qg4gx\") pod \"redhat-marketplace-w9xk8\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:31 crc kubenswrapper[4753]: I0129 14:48:31.671606 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:32 crc kubenswrapper[4753]: I0129 14:48:32.123019 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9xk8"] Jan 29 14:48:32 crc kubenswrapper[4753]: W0129 14:48:32.128786 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c9f5af_80f2_44ae_88b3_604c39032aa7.slice/crio-a01a9d1c48a848f9e87e74be6ef902bc61168b62a5d868533c4921a57b52f581 WatchSource:0}: Error finding container a01a9d1c48a848f9e87e74be6ef902bc61168b62a5d868533c4921a57b52f581: Status 404 returned error can't find the container with id a01a9d1c48a848f9e87e74be6ef902bc61168b62a5d868533c4921a57b52f581 Jan 29 14:48:32 crc kubenswrapper[4753]: I0129 14:48:32.300920 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9xk8" event={"ID":"99c9f5af-80f2-44ae-88b3-604c39032aa7","Type":"ContainerStarted","Data":"a01a9d1c48a848f9e87e74be6ef902bc61168b62a5d868533c4921a57b52f581"} Jan 29 14:48:33 crc kubenswrapper[4753]: I0129 14:48:33.311399 4753 generic.go:334] "Generic (PLEG): container finished" podID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerID="1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a" exitCode=0 Jan 29 14:48:33 crc kubenswrapper[4753]: I0129 14:48:33.311482 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9xk8" event={"ID":"99c9f5af-80f2-44ae-88b3-604c39032aa7","Type":"ContainerDied","Data":"1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a"} Jan 29 14:48:37 crc kubenswrapper[4753]: I0129 14:48:37.347712 4753 generic.go:334] "Generic (PLEG): container finished" podID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerID="79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5" exitCode=0 Jan 29 14:48:37 crc kubenswrapper[4753]: I0129 14:48:37.347775 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9xk8" event={"ID":"99c9f5af-80f2-44ae-88b3-604c39032aa7","Type":"ContainerDied","Data":"79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5"} Jan 29 14:48:40 crc kubenswrapper[4753]: I0129 14:48:40.375574 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9xk8" event={"ID":"99c9f5af-80f2-44ae-88b3-604c39032aa7","Type":"ContainerStarted","Data":"6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639"} Jan 29 14:48:40 crc kubenswrapper[4753]: I0129 14:48:40.413045 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-w9xk8" podStartSLOduration=3.73380373 podStartE2EDuration="9.413024312s" podCreationTimestamp="2026-01-29 14:48:31 +0000 UTC" firstStartedPulling="2026-01-29 14:48:33.313867806 +0000 UTC m=+2748.008602238" lastFinishedPulling="2026-01-29 14:48:38.993088438 +0000 UTC m=+2753.687822820" observedRunningTime="2026-01-29 14:48:40.405757185 +0000 UTC m=+2755.100491587" watchObservedRunningTime="2026-01-29 14:48:40.413024312 +0000 UTC m=+2755.107758724" Jan 29 14:48:41 crc kubenswrapper[4753]: I0129 14:48:41.672378 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:41 crc kubenswrapper[4753]: I0129 14:48:41.672476 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:41 crc kubenswrapper[4753]: I0129 14:48:41.748426 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:51 crc kubenswrapper[4753]: I0129 14:48:51.751742 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:51 crc kubenswrapper[4753]: I0129 14:48:51.824443 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9xk8"] Jan 29 14:48:52 crc kubenswrapper[4753]: I0129 14:48:52.490800 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w9xk8" podUID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerName="registry-server" containerID="cri-o://6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639" gracePeriod=2 Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.104511 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.180831 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-catalog-content\") pod \"99c9f5af-80f2-44ae-88b3-604c39032aa7\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.180886 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg4gx\" (UniqueName: \"kubernetes.io/projected/99c9f5af-80f2-44ae-88b3-604c39032aa7-kube-api-access-qg4gx\") pod \"99c9f5af-80f2-44ae-88b3-604c39032aa7\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.181098 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-utilities\") pod \"99c9f5af-80f2-44ae-88b3-604c39032aa7\" (UID: \"99c9f5af-80f2-44ae-88b3-604c39032aa7\") " Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.182031 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-utilities" (OuterVolumeSpecName: "utilities") pod "99c9f5af-80f2-44ae-88b3-604c39032aa7" (UID: "99c9f5af-80f2-44ae-88b3-604c39032aa7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.186999 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c9f5af-80f2-44ae-88b3-604c39032aa7-kube-api-access-qg4gx" (OuterVolumeSpecName: "kube-api-access-qg4gx") pod "99c9f5af-80f2-44ae-88b3-604c39032aa7" (UID: "99c9f5af-80f2-44ae-88b3-604c39032aa7"). InnerVolumeSpecName "kube-api-access-qg4gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.202065 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99c9f5af-80f2-44ae-88b3-604c39032aa7" (UID: "99c9f5af-80f2-44ae-88b3-604c39032aa7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.284492 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.284522 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c9f5af-80f2-44ae-88b3-604c39032aa7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.284534 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg4gx\" (UniqueName: \"kubernetes.io/projected/99c9f5af-80f2-44ae-88b3-604c39032aa7-kube-api-access-qg4gx\") on node \"crc\" DevicePath \"\"" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.516844 4753 generic.go:334] "Generic (PLEG): container finished" podID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerID="6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639" exitCode=0 Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.516899 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9xk8" event={"ID":"99c9f5af-80f2-44ae-88b3-604c39032aa7","Type":"ContainerDied","Data":"6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639"} Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.516906 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9xk8" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.516950 4753 scope.go:117] "RemoveContainer" containerID="6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.516936 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9xk8" event={"ID":"99c9f5af-80f2-44ae-88b3-604c39032aa7","Type":"ContainerDied","Data":"a01a9d1c48a848f9e87e74be6ef902bc61168b62a5d868533c4921a57b52f581"} Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.554236 4753 scope.go:117] "RemoveContainer" containerID="79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.555446 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9xk8"] Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.563430 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9xk8"] Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.576009 4753 scope.go:117] "RemoveContainer" containerID="1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.622298 4753 scope.go:117] "RemoveContainer" containerID="6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639" Jan 29 14:48:55 crc kubenswrapper[4753]: E0129 14:48:55.623105 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639\": container with ID starting with 6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639 not found: ID does not exist" containerID="6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.623193 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639"} err="failed to get container status \"6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639\": rpc error: code = NotFound desc = could not find container \"6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639\": container with ID starting with 6768a7d50a7adef8c9a309cc568d2ece5d75310ebed5874182b3c5ff377b0639 not found: ID does not exist" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.623223 4753 scope.go:117] "RemoveContainer" containerID="79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5" Jan 29 14:48:55 crc kubenswrapper[4753]: E0129 14:48:55.623615 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5\": container with ID starting with 79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5 not found: ID does not exist" containerID="79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.623636 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5"} err="failed to get container status \"79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5\": rpc error: code = NotFound desc = could not find 
container \"79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5\": container with ID starting with 79c4168390b8a375782c4b3dc39eb9e77a740c6d7c3595a33b9f65e2e1d1a7e5 not found: ID does not exist" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.623656 4753 scope.go:117] "RemoveContainer" containerID="1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a" Jan 29 14:48:55 crc kubenswrapper[4753]: E0129 14:48:55.623944 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a\": container with ID starting with 1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a not found: ID does not exist" containerID="1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a" Jan 29 14:48:55 crc kubenswrapper[4753]: I0129 14:48:55.623966 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a"} err="failed to get container status \"1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a\": rpc error: code = NotFound desc = could not find container \"1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a\": container with ID starting with 1225f49358fa4316d8f32aaab731072a2640cda199c9dcd590eff8f2bd7d377a not found: ID does not exist" Jan 29 14:48:56 crc kubenswrapper[4753]: I0129 14:48:56.166483 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c9f5af-80f2-44ae-88b3-604c39032aa7" path="/var/lib/kubelet/pods/99c9f5af-80f2-44ae-88b3-604c39032aa7/volumes" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.047688 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7zb55"] Jan 29 14:50:31 crc kubenswrapper[4753]: E0129 14:50:31.049069 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerName="extract-content" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.049102 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerName="extract-content" Jan 29 14:50:31 crc kubenswrapper[4753]: E0129 14:50:31.049189 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerName="registry-server" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.049209 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerName="registry-server" Jan 29 14:50:31 crc kubenswrapper[4753]: E0129 14:50:31.049240 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerName="extract-utilities" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.049259 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerName="extract-utilities" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.049583 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c9f5af-80f2-44ae-88b3-604c39032aa7" containerName="registry-server" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.051947 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.067662 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7zb55"] Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.190595 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-utilities\") pod \"community-operators-7zb55\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.190659 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xd9h\" (UniqueName: \"kubernetes.io/projected/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-kube-api-access-2xd9h\") pod \"community-operators-7zb55\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.190713 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-catalog-content\") pod \"community-operators-7zb55\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.291973 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-utilities\") pod \"community-operators-7zb55\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.292022 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xd9h\" (UniqueName: \"kubernetes.io/projected/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-kube-api-access-2xd9h\") pod \"community-operators-7zb55\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.292063 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-catalog-content\") pod \"community-operators-7zb55\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.293113 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-catalog-content\") pod \"community-operators-7zb55\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.293574 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-utilities\") pod \"community-operators-7zb55\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.315388 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2xd9h\" (UniqueName: \"kubernetes.io/projected/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-kube-api-access-2xd9h\") pod \"community-operators-7zb55\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.382139 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:31 crc kubenswrapper[4753]: I0129 14:50:31.900527 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7zb55"] Jan 29 14:50:32 crc kubenswrapper[4753]: I0129 14:50:32.283815 4753 generic.go:334] "Generic (PLEG): container finished" podID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerID="196db026bc9dbc688e65f37459c9aef022b21f5141c763bb92dd3ba4893512d6" exitCode=0 Jan 29 14:50:32 crc kubenswrapper[4753]: I0129 14:50:32.283938 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zb55" event={"ID":"7b2b03b7-f109-4a7f-90f0-6ef498e4f100","Type":"ContainerDied","Data":"196db026bc9dbc688e65f37459c9aef022b21f5141c763bb92dd3ba4893512d6"} Jan 29 14:50:32 crc kubenswrapper[4753]: I0129 14:50:32.284143 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zb55" event={"ID":"7b2b03b7-f109-4a7f-90f0-6ef498e4f100","Type":"ContainerStarted","Data":"8b7cdd068d591ea1f5dfbb83fc7b6bb460a357618950074d156ee9a2ee167154"} Jan 29 14:50:34 crc kubenswrapper[4753]: I0129 14:50:34.299528 4753 generic.go:334] "Generic (PLEG): container finished" podID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerID="fc1e378ea87051197f5d3f758f3e9ade6086009d0c90ef8e691a880623e7613a" exitCode=0 Jan 29 14:50:34 crc kubenswrapper[4753]: I0129 14:50:34.299638 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zb55" event={"ID":"7b2b03b7-f109-4a7f-90f0-6ef498e4f100","Type":"ContainerDied","Data":"fc1e378ea87051197f5d3f758f3e9ade6086009d0c90ef8e691a880623e7613a"} Jan 29 14:50:36 crc kubenswrapper[4753]: I0129 14:50:36.316948 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zb55" event={"ID":"7b2b03b7-f109-4a7f-90f0-6ef498e4f100","Type":"ContainerStarted","Data":"1aa0ca78247046bfe0a18be19768face55e60233762635e276537568958221d4"} Jan 29 14:50:36 crc kubenswrapper[4753]: I0129 14:50:36.336347 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7zb55" podStartSLOduration=2.407239311 podStartE2EDuration="5.336330535s" podCreationTimestamp="2026-01-29 14:50:31 +0000 UTC" firstStartedPulling="2026-01-29 14:50:32.285395292 +0000 UTC m=+2866.980129674" lastFinishedPulling="2026-01-29 14:50:35.214486516 +0000 UTC m=+2869.909220898" observedRunningTime="2026-01-29 14:50:36.33130262 +0000 UTC m=+2871.026036992" watchObservedRunningTime="2026-01-29 14:50:36.336330535 +0000 UTC m=+2871.031064937" Jan 29 14:50:41 crc kubenswrapper[4753]: I0129 14:50:41.384196 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:41 crc kubenswrapper[4753]: I0129 14:50:41.384483 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:41 crc kubenswrapper[4753]: I0129 14:50:41.451930 4753 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:42 crc kubenswrapper[4753]: I0129 14:50:42.411550 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:42 crc kubenswrapper[4753]: I0129 14:50:42.833447 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7zb55"] Jan 29 14:50:44 crc kubenswrapper[4753]: I0129 14:50:44.372803 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7zb55" podUID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerName="registry-server" containerID="cri-o://1aa0ca78247046bfe0a18be19768face55e60233762635e276537568958221d4" gracePeriod=2 Jan 29 14:50:45 crc kubenswrapper[4753]: I0129 14:50:45.382075 4753 generic.go:334] "Generic (PLEG): container finished" podID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerID="1aa0ca78247046bfe0a18be19768face55e60233762635e276537568958221d4" exitCode=0 Jan 29 14:50:45 crc kubenswrapper[4753]: I0129 14:50:45.382376 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zb55" event={"ID":"7b2b03b7-f109-4a7f-90f0-6ef498e4f100","Type":"ContainerDied","Data":"1aa0ca78247046bfe0a18be19768face55e60233762635e276537568958221d4"} Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.279774 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.334507 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-utilities\") pod \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.334562 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xd9h\" (UniqueName: \"kubernetes.io/projected/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-kube-api-access-2xd9h\") pod \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.334600 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-catalog-content\") pod \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\" (UID: \"7b2b03b7-f109-4a7f-90f0-6ef498e4f100\") " Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.335436 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-utilities" (OuterVolumeSpecName: "utilities") pod "7b2b03b7-f109-4a7f-90f0-6ef498e4f100" (UID: "7b2b03b7-f109-4a7f-90f0-6ef498e4f100"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.340407 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-kube-api-access-2xd9h" (OuterVolumeSpecName: "kube-api-access-2xd9h") pod "7b2b03b7-f109-4a7f-90f0-6ef498e4f100" (UID: "7b2b03b7-f109-4a7f-90f0-6ef498e4f100"). InnerVolumeSpecName "kube-api-access-2xd9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.402689 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zb55" event={"ID":"7b2b03b7-f109-4a7f-90f0-6ef498e4f100","Type":"ContainerDied","Data":"8b7cdd068d591ea1f5dfbb83fc7b6bb460a357618950074d156ee9a2ee167154"} Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.402734 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zb55" Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.402749 4753 scope.go:117] "RemoveContainer" containerID="1aa0ca78247046bfe0a18be19768face55e60233762635e276537568958221d4" Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.420921 4753 scope.go:117] "RemoveContainer" containerID="fc1e378ea87051197f5d3f758f3e9ade6086009d0c90ef8e691a880623e7613a" Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.435255 4753 scope.go:117] "RemoveContainer" containerID="196db026bc9dbc688e65f37459c9aef022b21f5141c763bb92dd3ba4893512d6" Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.436177 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:50:47 crc kubenswrapper[4753]: I0129 14:50:47.436334 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xd9h\" (UniqueName: \"kubernetes.io/projected/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-kube-api-access-2xd9h\") on node \"crc\" DevicePath \"\"" Jan 29 14:50:48 crc kubenswrapper[4753]: I0129 14:50:48.485959 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b2b03b7-f109-4a7f-90f0-6ef498e4f100" (UID: "7b2b03b7-f109-4a7f-90f0-6ef498e4f100"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:50:48 crc kubenswrapper[4753]: I0129 14:50:48.554612 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b2b03b7-f109-4a7f-90f0-6ef498e4f100-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:50:48 crc kubenswrapper[4753]: I0129 14:50:48.630786 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7zb55"] Jan 29 14:50:48 crc kubenswrapper[4753]: I0129 14:50:48.635996 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7zb55"] Jan 29 14:50:50 crc kubenswrapper[4753]: I0129 14:50:50.159282 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" path="/var/lib/kubelet/pods/7b2b03b7-f109-4a7f-90f0-6ef498e4f100/volumes" Jan 29 14:50:57 crc kubenswrapper[4753]: I0129 14:50:57.054822 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:50:57 crc kubenswrapper[4753]: I0129 14:50:57.055459 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:51:27 crc kubenswrapper[4753]: I0129 14:51:27.054523 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:51:27 crc kubenswrapper[4753]: I0129 14:51:27.055120 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:51:57 crc kubenswrapper[4753]: I0129 14:51:57.054520 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:51:57 crc kubenswrapper[4753]: I0129 14:51:57.054952 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:51:57 crc kubenswrapper[4753]: I0129 14:51:57.055083 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 14:51:57 crc kubenswrapper[4753]: I0129 14:51:57.055887 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 14:51:57 crc kubenswrapper[4753]: I0129 14:51:57.055957 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" gracePeriod=600 Jan 29 14:51:57 crc kubenswrapper[4753]: E0129 14:51:57.197902 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:51:58 crc kubenswrapper[4753]: I0129 14:51:58.043754 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" exitCode=0 Jan 29 14:51:58 crc kubenswrapper[4753]: I0129 14:51:58.043799 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8"} Jan 29 14:51:58 crc kubenswrapper[4753]: I0129 14:51:58.043833 4753 scope.go:117] "RemoveContainer" containerID="f2942e28242220ecf81b3ccb242246ca35d9dd454dc42c62353d1a49f59b34e7" Jan 29 14:51:58 crc kubenswrapper[4753]: I0129 14:51:58.044297 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:51:58 crc kubenswrapper[4753]: E0129 14:51:58.044482 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:52:09 crc kubenswrapper[4753]: I0129 14:52:09.149657 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:52:09 crc kubenswrapper[4753]: E0129 14:52:09.150392 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:52:21 crc kubenswrapper[4753]: I0129 14:52:21.149550 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:52:21 crc kubenswrapper[4753]: E0129 14:52:21.150399 4753 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:52:36 crc kubenswrapper[4753]: I0129 14:52:36.156864 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:52:36 crc kubenswrapper[4753]: E0129 14:52:36.158134 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:52:47 crc kubenswrapper[4753]: I0129 14:52:47.149602 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:52:47 crc kubenswrapper[4753]: E0129 14:52:47.150404 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:53:01 crc kubenswrapper[4753]: I0129 14:53:01.149525 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:53:01 crc kubenswrapper[4753]: E0129 14:53:01.150355 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:53:05 crc kubenswrapper[4753]: I0129 14:53:05.326208 4753 scope.go:117] "RemoveContainer" containerID="ace5a993cad320757dfc776540fa39ffde2456e458219679ee197e5df0cc7494" Jan 29 14:53:15 crc kubenswrapper[4753]: I0129 14:53:15.150065 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:53:15 crc kubenswrapper[4753]: E0129 14:53:15.150786 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:53:26 crc kubenswrapper[4753]: I0129 14:53:26.173085 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:53:26 crc kubenswrapper[4753]: E0129 14:53:26.174318 4753 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:53:39 crc kubenswrapper[4753]: I0129 14:53:39.150039 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:53:39 crc kubenswrapper[4753]: E0129 14:53:39.150808 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:53:53 crc kubenswrapper[4753]: I0129 14:53:53.149331 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:53:53 crc kubenswrapper[4753]: E0129 14:53:53.151205 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:54:05 crc kubenswrapper[4753]: I0129 14:54:05.377215 4753 scope.go:117] "RemoveContainer" containerID="bcd7e52980f0b15b4996a3ff951306ba0b2f5d0bd2c0831664d2ce02864fbcc5" Jan 29 14:54:05 crc kubenswrapper[4753]: I0129 14:54:05.411797 4753 scope.go:117] "RemoveContainer" containerID="6ebb7345b6668de094c41541213c5bbd89952a5d046391785daf56d2a83c3b98" Jan 29 14:54:06 crc kubenswrapper[4753]: I0129 14:54:06.159207 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:54:06 crc kubenswrapper[4753]: E0129 14:54:06.160371 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:54:18 crc kubenswrapper[4753]: I0129 14:54:18.150146 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:54:18 crc kubenswrapper[4753]: E0129 14:54:18.151495 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:54:30 crc kubenswrapper[4753]: I0129 14:54:30.150107 4753 scope.go:117] "RemoveContainer" 
containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:54:30 crc kubenswrapper[4753]: E0129 14:54:30.151405 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:54:44 crc kubenswrapper[4753]: I0129 14:54:44.149992 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:54:44 crc kubenswrapper[4753]: E0129 14:54:44.150744 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:54:58 crc kubenswrapper[4753]: I0129 14:54:58.150656 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:54:58 crc kubenswrapper[4753]: E0129 14:54:58.151365 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:55:09 crc kubenswrapper[4753]: I0129 14:55:09.149957 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:55:09 crc kubenswrapper[4753]: E0129 14:55:09.150860 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:55:23 crc kubenswrapper[4753]: I0129 14:55:23.149513 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:55:23 crc kubenswrapper[4753]: E0129 14:55:23.150312 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:55:35 crc kubenswrapper[4753]: I0129 14:55:35.151129 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:55:35 crc kubenswrapper[4753]: E0129 14:55:35.152409 4753 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:55:49 crc kubenswrapper[4753]: I0129 14:55:49.149855 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:55:49 crc kubenswrapper[4753]: E0129 14:55:49.150978 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:56:04 crc kubenswrapper[4753]: I0129 14:56:04.150413 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:56:04 crc kubenswrapper[4753]: E0129 14:56:04.151566 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:56:16 crc kubenswrapper[4753]: I0129 14:56:16.157715 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:56:16 crc kubenswrapper[4753]: E0129 14:56:16.158491 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:56:30 crc kubenswrapper[4753]: I0129 14:56:30.149193 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:56:30 crc kubenswrapper[4753]: E0129 14:56:30.149981 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:56:43 crc kubenswrapper[4753]: I0129 14:56:43.149308 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:56:43 crc kubenswrapper[4753]: E0129 14:56:43.150213 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:56:55 crc kubenswrapper[4753]: I0129 14:56:55.149705 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:56:55 crc kubenswrapper[4753]: E0129 14:56:55.151652 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 14:57:08 crc kubenswrapper[4753]: I0129 14:57:08.150098 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 14:57:09 crc kubenswrapper[4753]: I0129 14:57:09.365113 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"ce8ef4063602211ddb25c697cc29c8c5034bf98ea655922724a788bd505605e4"} Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.493383 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hwktw"] Jan 29 14:57:36 crc kubenswrapper[4753]: E0129 14:57:36.494201 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerName="extract-content" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.494216 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerName="extract-content" Jan 29 14:57:36 crc kubenswrapper[4753]: E0129 14:57:36.494232 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerName="registry-server" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.494238 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerName="registry-server" Jan 29 14:57:36 crc kubenswrapper[4753]: E0129 14:57:36.494246 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerName="extract-utilities" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.494253 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerName="extract-utilities" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.494587 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2b03b7-f109-4a7f-90f0-6ef498e4f100" containerName="registry-server" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.495739 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.502983 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwktw"] Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.584581 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d4n2\" (UniqueName: \"kubernetes.io/projected/178caac8-a41e-4db5-933b-6720b2aa4e17-kube-api-access-7d4n2\") pod \"redhat-operators-hwktw\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.584752 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-catalog-content\") pod \"redhat-operators-hwktw\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.584794 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-utilities\") pod \"redhat-operators-hwktw\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.686097 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d4n2\" (UniqueName: \"kubernetes.io/projected/178caac8-a41e-4db5-933b-6720b2aa4e17-kube-api-access-7d4n2\") pod \"redhat-operators-hwktw\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.686185 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-catalog-content\") pod \"redhat-operators-hwktw\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.686220 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-utilities\") pod \"redhat-operators-hwktw\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.686692 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-catalog-content\") pod \"redhat-operators-hwktw\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.686745 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-utilities\") pod \"redhat-operators-hwktw\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.714589 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7d4n2\" (UniqueName: \"kubernetes.io/projected/178caac8-a41e-4db5-933b-6720b2aa4e17-kube-api-access-7d4n2\") pod \"redhat-operators-hwktw\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:36 crc kubenswrapper[4753]: I0129 14:57:36.814944 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:37 crc kubenswrapper[4753]: I0129 14:57:37.277353 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwktw"] Jan 29 14:57:37 crc kubenswrapper[4753]: I0129 14:57:37.559188 4753 generic.go:334] "Generic (PLEG): container finished" podID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerID="2c2ad826f6276518a50ae8ca72a5abf25fb37b5ff3da0857f33df74ce0076f59" exitCode=0 Jan 29 14:57:37 crc kubenswrapper[4753]: I0129 14:57:37.559250 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwktw" event={"ID":"178caac8-a41e-4db5-933b-6720b2aa4e17","Type":"ContainerDied","Data":"2c2ad826f6276518a50ae8ca72a5abf25fb37b5ff3da0857f33df74ce0076f59"} Jan 29 14:57:37 crc kubenswrapper[4753]: I0129 14:57:37.559286 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwktw" event={"ID":"178caac8-a41e-4db5-933b-6720b2aa4e17","Type":"ContainerStarted","Data":"961f7de7baf181aaa103bcfaf19d03a2e02da08ab319b419b8b7f70a5cd11ee0"} Jan 29 14:57:37 crc kubenswrapper[4753]: I0129 14:57:37.562535 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 14:57:38 crc kubenswrapper[4753]: I0129 14:57:38.569162 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwktw" event={"ID":"178caac8-a41e-4db5-933b-6720b2aa4e17","Type":"ContainerStarted","Data":"fb94c680496af4cfaeb7b9ce3511c95b819d025f42ee7111a5bc27dc20ebf58e"} Jan 29 14:57:39 crc kubenswrapper[4753]: I0129 14:57:39.585185 4753 generic.go:334] "Generic (PLEG): container finished" podID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerID="fb94c680496af4cfaeb7b9ce3511c95b819d025f42ee7111a5bc27dc20ebf58e" exitCode=0 Jan 29 14:57:39 crc kubenswrapper[4753]: I0129 14:57:39.585321 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwktw" event={"ID":"178caac8-a41e-4db5-933b-6720b2aa4e17","Type":"ContainerDied","Data":"fb94c680496af4cfaeb7b9ce3511c95b819d025f42ee7111a5bc27dc20ebf58e"} Jan 29 14:57:40 crc kubenswrapper[4753]: I0129 14:57:40.595837 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwktw" event={"ID":"178caac8-a41e-4db5-933b-6720b2aa4e17","Type":"ContainerStarted","Data":"5d6574829c02367a133260546854989f3081bfad012988fd428547de1f3d4271"} Jan 29 14:57:40 crc kubenswrapper[4753]: I0129 14:57:40.614336 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hwktw" podStartSLOduration=2.132161348 podStartE2EDuration="4.614314237s" podCreationTimestamp="2026-01-29 14:57:36 +0000 UTC" firstStartedPulling="2026-01-29 14:57:37.562209802 +0000 UTC m=+3292.256944184" lastFinishedPulling="2026-01-29 14:57:40.044362691 +0000 UTC m=+3294.739097073" observedRunningTime="2026-01-29 14:57:40.611599973 +0000 UTC m=+3295.306334365" watchObservedRunningTime="2026-01-29 14:57:40.614314237 +0000 UTC m=+3295.309048639" Jan 29 14:57:46 crc 
kubenswrapper[4753]: I0129 14:57:46.815265 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:46 crc kubenswrapper[4753]: I0129 14:57:46.815636 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:46 crc kubenswrapper[4753]: I0129 14:57:46.864104 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:47 crc kubenswrapper[4753]: I0129 14:57:47.689656 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:47 crc kubenswrapper[4753]: I0129 14:57:47.739887 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwktw"] Jan 29 14:57:49 crc kubenswrapper[4753]: I0129 14:57:49.653343 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hwktw" podUID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerName="registry-server" containerID="cri-o://5d6574829c02367a133260546854989f3081bfad012988fd428547de1f3d4271" gracePeriod=2 Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.677760 4753 generic.go:334] "Generic (PLEG): container finished" podID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerID="5d6574829c02367a133260546854989f3081bfad012988fd428547de1f3d4271" exitCode=0 Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.677848 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwktw" event={"ID":"178caac8-a41e-4db5-933b-6720b2aa4e17","Type":"ContainerDied","Data":"5d6574829c02367a133260546854989f3081bfad012988fd428547de1f3d4271"} Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.806242 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.834853 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d4n2\" (UniqueName: \"kubernetes.io/projected/178caac8-a41e-4db5-933b-6720b2aa4e17-kube-api-access-7d4n2\") pod \"178caac8-a41e-4db5-933b-6720b2aa4e17\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.835046 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-catalog-content\") pod \"178caac8-a41e-4db5-933b-6720b2aa4e17\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.835183 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-utilities\") pod \"178caac8-a41e-4db5-933b-6720b2aa4e17\" (UID: \"178caac8-a41e-4db5-933b-6720b2aa4e17\") " Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.836052 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-utilities" (OuterVolumeSpecName: "utilities") pod "178caac8-a41e-4db5-933b-6720b2aa4e17" (UID: "178caac8-a41e-4db5-933b-6720b2aa4e17"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.841780 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/178caac8-a41e-4db5-933b-6720b2aa4e17-kube-api-access-7d4n2" (OuterVolumeSpecName: "kube-api-access-7d4n2") pod "178caac8-a41e-4db5-933b-6720b2aa4e17" (UID: "178caac8-a41e-4db5-933b-6720b2aa4e17"). InnerVolumeSpecName "kube-api-access-7d4n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.936782 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d4n2\" (UniqueName: \"kubernetes.io/projected/178caac8-a41e-4db5-933b-6720b2aa4e17-kube-api-access-7d4n2\") on node \"crc\" DevicePath \"\"" Jan 29 14:57:52 crc kubenswrapper[4753]: I0129 14:57:52.936834 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:57:53 crc kubenswrapper[4753]: I0129 14:57:53.014284 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "178caac8-a41e-4db5-933b-6720b2aa4e17" (UID: "178caac8-a41e-4db5-933b-6720b2aa4e17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:57:53 crc kubenswrapper[4753]: I0129 14:57:53.038039 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/178caac8-a41e-4db5-933b-6720b2aa4e17-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:57:53 crc kubenswrapper[4753]: I0129 14:57:53.688092 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwktw" event={"ID":"178caac8-a41e-4db5-933b-6720b2aa4e17","Type":"ContainerDied","Data":"961f7de7baf181aaa103bcfaf19d03a2e02da08ab319b419b8b7f70a5cd11ee0"} Jan 29 14:57:53 crc kubenswrapper[4753]: I0129 14:57:53.688165 4753 scope.go:117] "RemoveContainer" containerID="5d6574829c02367a133260546854989f3081bfad012988fd428547de1f3d4271" Jan 29 14:57:53 crc kubenswrapper[4753]: I0129 14:57:53.688187 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hwktw" Jan 29 14:57:53 crc kubenswrapper[4753]: I0129 14:57:53.706947 4753 scope.go:117] "RemoveContainer" containerID="fb94c680496af4cfaeb7b9ce3511c95b819d025f42ee7111a5bc27dc20ebf58e" Jan 29 14:57:53 crc kubenswrapper[4753]: I0129 14:57:53.723508 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwktw"] Jan 29 14:57:53 crc kubenswrapper[4753]: I0129 14:57:53.730630 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hwktw"] Jan 29 14:57:53 crc kubenswrapper[4753]: I0129 14:57:53.734384 4753 scope.go:117] "RemoveContainer" containerID="2c2ad826f6276518a50ae8ca72a5abf25fb37b5ff3da0857f33df74ce0076f59" Jan 29 14:57:54 crc kubenswrapper[4753]: I0129 14:57:54.160432 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="178caac8-a41e-4db5-933b-6720b2aa4e17" path="/var/lib/kubelet/pods/178caac8-a41e-4db5-933b-6720b2aa4e17/volumes" Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.847438 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s4d8g"] Jan 29 14:58:28 crc kubenswrapper[4753]: E0129 14:58:28.848621 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerName="extract-content" Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.848641 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerName="extract-content" Jan 29 14:58:28 crc kubenswrapper[4753]: E0129 14:58:28.848658 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerName="extract-utilities" Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.848666 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerName="extract-utilities" Jan 29 14:58:28 crc kubenswrapper[4753]: E0129 14:58:28.848686 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerName="registry-server" Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.848694 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerName="registry-server" Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.848868 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="178caac8-a41e-4db5-933b-6720b2aa4e17" containerName="registry-server" Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.850072 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.864034 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4d8g"] Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.955377 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-catalog-content\") pod \"certified-operators-s4d8g\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.955440 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-utilities\") pod \"certified-operators-s4d8g\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:28 crc kubenswrapper[4753]: I0129 14:58:28.955583 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44fh5\" (UniqueName: \"kubernetes.io/projected/e8334c18-90fd-4856-aa9c-24a59b35e96f-kube-api-access-44fh5\") pod \"certified-operators-s4d8g\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.056554 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-catalog-content\") pod \"certified-operators-s4d8g\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.056635 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-utilities\") pod \"certified-operators-s4d8g\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.056686 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44fh5\" (UniqueName: \"kubernetes.io/projected/e8334c18-90fd-4856-aa9c-24a59b35e96f-kube-api-access-44fh5\") pod \"certified-operators-s4d8g\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.057382 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-catalog-content\") pod \"certified-operators-s4d8g\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.057506 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-utilities\") pod \"certified-operators-s4d8g\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.077119 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-44fh5\" (UniqueName: \"kubernetes.io/projected/e8334c18-90fd-4856-aa9c-24a59b35e96f-kube-api-access-44fh5\") pod \"certified-operators-s4d8g\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.180941 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.697118 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4d8g"] Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.949308 4753 generic.go:334] "Generic (PLEG): container finished" podID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerID="c201969b7597a627f75b448fa05f08db03f9ca529f7a746d101e2fb337b333c8" exitCode=0 Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.949377 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4d8g" event={"ID":"e8334c18-90fd-4856-aa9c-24a59b35e96f","Type":"ContainerDied","Data":"c201969b7597a627f75b448fa05f08db03f9ca529f7a746d101e2fb337b333c8"} Jan 29 14:58:29 crc kubenswrapper[4753]: I0129 14:58:29.949435 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4d8g" event={"ID":"e8334c18-90fd-4856-aa9c-24a59b35e96f","Type":"ContainerStarted","Data":"96fe38755266fd1b79c29a7eea7dc57d3e96002f00ea580de9aa5c3ad9dc0a2c"} Jan 29 14:58:31 crc kubenswrapper[4753]: I0129 14:58:31.964017 4753 generic.go:334] "Generic (PLEG): container finished" podID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerID="edd0704dd1e3343ed82fd5eee43a306266615ae4ab4f305474247a248cf6ce9d" exitCode=0 Jan 29 14:58:31 crc kubenswrapper[4753]: I0129 14:58:31.964104 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4d8g" event={"ID":"e8334c18-90fd-4856-aa9c-24a59b35e96f","Type":"ContainerDied","Data":"edd0704dd1e3343ed82fd5eee43a306266615ae4ab4f305474247a248cf6ce9d"} Jan 29 14:58:32 crc kubenswrapper[4753]: I0129 14:58:32.974560 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4d8g" event={"ID":"e8334c18-90fd-4856-aa9c-24a59b35e96f","Type":"ContainerStarted","Data":"3610dd4225475a43fe5cf3eebd6e8217510fc5813a41a584bc9d1e614e8aae67"} Jan 29 14:58:33 crc kubenswrapper[4753]: I0129 14:58:33.016515 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s4d8g" podStartSLOduration=2.604411081 podStartE2EDuration="5.016498942s" podCreationTimestamp="2026-01-29 14:58:28 +0000 UTC" firstStartedPulling="2026-01-29 14:58:29.951051791 +0000 UTC m=+3344.645786183" lastFinishedPulling="2026-01-29 14:58:32.363139662 +0000 UTC m=+3347.057874044" observedRunningTime="2026-01-29 14:58:33.007545062 +0000 UTC m=+3347.702279444" watchObservedRunningTime="2026-01-29 14:58:33.016498942 +0000 UTC m=+3347.711233324" Jan 29 14:58:39 crc kubenswrapper[4753]: I0129 14:58:39.181986 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:39 crc kubenswrapper[4753]: I0129 14:58:39.182710 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:39 crc kubenswrapper[4753]: I0129 14:58:39.222107 4753 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:40 crc kubenswrapper[4753]: I0129 14:58:40.067397 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:40 crc kubenswrapper[4753]: I0129 14:58:40.116121 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s4d8g"] Jan 29 14:58:42 crc kubenswrapper[4753]: I0129 14:58:42.036773 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s4d8g" podUID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerName="registry-server" containerID="cri-o://3610dd4225475a43fe5cf3eebd6e8217510fc5813a41a584bc9d1e614e8aae67" gracePeriod=2 Jan 29 14:58:43 crc kubenswrapper[4753]: I0129 14:58:43.047149 4753 generic.go:334] "Generic (PLEG): container finished" podID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerID="3610dd4225475a43fe5cf3eebd6e8217510fc5813a41a584bc9d1e614e8aae67" exitCode=0 Jan 29 14:58:43 crc kubenswrapper[4753]: I0129 14:58:43.047236 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4d8g" event={"ID":"e8334c18-90fd-4856-aa9c-24a59b35e96f","Type":"ContainerDied","Data":"3610dd4225475a43fe5cf3eebd6e8217510fc5813a41a584bc9d1e614e8aae67"} Jan 29 14:58:44 crc kubenswrapper[4753]: I0129 14:58:44.764624 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:44 crc kubenswrapper[4753]: I0129 14:58:44.886438 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-catalog-content\") pod \"e8334c18-90fd-4856-aa9c-24a59b35e96f\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " Jan 29 14:58:44 crc kubenswrapper[4753]: I0129 14:58:44.886777 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-utilities\") pod \"e8334c18-90fd-4856-aa9c-24a59b35e96f\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " Jan 29 14:58:44 crc kubenswrapper[4753]: I0129 14:58:44.886964 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44fh5\" (UniqueName: \"kubernetes.io/projected/e8334c18-90fd-4856-aa9c-24a59b35e96f-kube-api-access-44fh5\") pod \"e8334c18-90fd-4856-aa9c-24a59b35e96f\" (UID: \"e8334c18-90fd-4856-aa9c-24a59b35e96f\") " Jan 29 14:58:44 crc kubenswrapper[4753]: I0129 14:58:44.889089 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-utilities" (OuterVolumeSpecName: "utilities") pod "e8334c18-90fd-4856-aa9c-24a59b35e96f" (UID: "e8334c18-90fd-4856-aa9c-24a59b35e96f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:58:44 crc kubenswrapper[4753]: I0129 14:58:44.892487 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8334c18-90fd-4856-aa9c-24a59b35e96f-kube-api-access-44fh5" (OuterVolumeSpecName: "kube-api-access-44fh5") pod "e8334c18-90fd-4856-aa9c-24a59b35e96f" (UID: "e8334c18-90fd-4856-aa9c-24a59b35e96f"). InnerVolumeSpecName "kube-api-access-44fh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 14:58:44 crc kubenswrapper[4753]: I0129 14:58:44.990111 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 14:58:44 crc kubenswrapper[4753]: I0129 14:58:44.990150 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44fh5\" (UniqueName: \"kubernetes.io/projected/e8334c18-90fd-4856-aa9c-24a59b35e96f-kube-api-access-44fh5\") on node \"crc\" DevicePath \"\"" Jan 29 14:58:45 crc kubenswrapper[4753]: I0129 14:58:45.032190 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8334c18-90fd-4856-aa9c-24a59b35e96f" (UID: "e8334c18-90fd-4856-aa9c-24a59b35e96f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 14:58:45 crc kubenswrapper[4753]: I0129 14:58:45.064023 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4d8g" event={"ID":"e8334c18-90fd-4856-aa9c-24a59b35e96f","Type":"ContainerDied","Data":"96fe38755266fd1b79c29a7eea7dc57d3e96002f00ea580de9aa5c3ad9dc0a2c"} Jan 29 14:58:45 crc kubenswrapper[4753]: I0129 14:58:45.064103 4753 scope.go:117] "RemoveContainer" containerID="3610dd4225475a43fe5cf3eebd6e8217510fc5813a41a584bc9d1e614e8aae67" Jan 29 14:58:45 crc kubenswrapper[4753]: I0129 14:58:45.064118 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4d8g" Jan 29 14:58:45 crc kubenswrapper[4753]: I0129 14:58:45.091436 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8334c18-90fd-4856-aa9c-24a59b35e96f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 14:58:45 crc kubenswrapper[4753]: I0129 14:58:45.098466 4753 scope.go:117] "RemoveContainer" containerID="edd0704dd1e3343ed82fd5eee43a306266615ae4ab4f305474247a248cf6ce9d" Jan 29 14:58:45 crc kubenswrapper[4753]: I0129 14:58:45.113005 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s4d8g"] Jan 29 14:58:45 crc kubenswrapper[4753]: I0129 14:58:45.123005 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s4d8g"] Jan 29 14:58:45 crc kubenswrapper[4753]: I0129 14:58:45.132507 4753 scope.go:117] "RemoveContainer" containerID="c201969b7597a627f75b448fa05f08db03f9ca529f7a746d101e2fb337b333c8" Jan 29 14:58:46 crc kubenswrapper[4753]: I0129 14:58:46.158007 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8334c18-90fd-4856-aa9c-24a59b35e96f" path="/var/lib/kubelet/pods/e8334c18-90fd-4856-aa9c-24a59b35e96f/volumes" Jan 29 14:59:27 crc kubenswrapper[4753]: I0129 14:59:27.054549 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:59:27 crc kubenswrapper[4753]: I0129 14:59:27.055196 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.496043 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5nf"] Jan 29 14:59:53 crc kubenswrapper[4753]: E0129 14:59:53.496846 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerName="registry-server" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.496859 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerName="registry-server" Jan 29 14:59:53 crc kubenswrapper[4753]: E0129 14:59:53.496878 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerName="extract-utilities" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.496884 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerName="extract-utilities" Jan 29 14:59:53 crc kubenswrapper[4753]: E0129 14:59:53.496897 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerName="extract-content" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.496903 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerName="extract-content" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.497034 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8334c18-90fd-4856-aa9c-24a59b35e96f" containerName="registry-server" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.497946 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.522005 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5nf"] Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.586578 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-utilities\") pod \"redhat-marketplace-fj5nf\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.586705 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98n2r\" (UniqueName: \"kubernetes.io/projected/42fd6e07-17e1-47a5-bb1c-6dea325233be-kube-api-access-98n2r\") pod \"redhat-marketplace-fj5nf\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.586771 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-catalog-content\") pod \"redhat-marketplace-fj5nf\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.688609 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-catalog-content\") pod \"redhat-marketplace-fj5nf\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.688724 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-utilities\") pod \"redhat-marketplace-fj5nf\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.688841 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98n2r\" (UniqueName: \"kubernetes.io/projected/42fd6e07-17e1-47a5-bb1c-6dea325233be-kube-api-access-98n2r\") pod \"redhat-marketplace-fj5nf\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.689359 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-utilities\") pod \"redhat-marketplace-fj5nf\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.689461 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-catalog-content\") pod \"redhat-marketplace-fj5nf\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.709058 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-98n2r\" (UniqueName: \"kubernetes.io/projected/42fd6e07-17e1-47a5-bb1c-6dea325233be-kube-api-access-98n2r\") pod \"redhat-marketplace-fj5nf\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:53 crc kubenswrapper[4753]: I0129 14:59:53.819390 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 14:59:54 crc kubenswrapper[4753]: I0129 14:59:54.221746 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5nf"] Jan 29 14:59:54 crc kubenswrapper[4753]: I0129 14:59:54.584141 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5nf" event={"ID":"42fd6e07-17e1-47a5-bb1c-6dea325233be","Type":"ContainerStarted","Data":"52631898c65d0f8b282139e906803d8d31c9248b904cd2f630d67ef77ec7ea54"} Jan 29 14:59:55 crc kubenswrapper[4753]: I0129 14:59:55.591127 4753 generic.go:334] "Generic (PLEG): container finished" podID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerID="7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002" exitCode=0 Jan 29 14:59:55 crc kubenswrapper[4753]: I0129 14:59:55.591197 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5nf" event={"ID":"42fd6e07-17e1-47a5-bb1c-6dea325233be","Type":"ContainerDied","Data":"7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002"} Jan 29 14:59:56 crc kubenswrapper[4753]: I0129 14:59:56.598619 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5nf" event={"ID":"42fd6e07-17e1-47a5-bb1c-6dea325233be","Type":"ContainerStarted","Data":"8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d"} Jan 29 14:59:57 crc kubenswrapper[4753]: I0129 14:59:57.054476 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 14:59:57 crc kubenswrapper[4753]: I0129 14:59:57.054535 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 14:59:57 crc kubenswrapper[4753]: I0129 14:59:57.606387 4753 generic.go:334] "Generic (PLEG): container finished" podID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerID="8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d" exitCode=0 Jan 29 14:59:57 crc kubenswrapper[4753]: I0129 14:59:57.606429 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5nf" event={"ID":"42fd6e07-17e1-47a5-bb1c-6dea325233be","Type":"ContainerDied","Data":"8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d"} Jan 29 14:59:58 crc kubenswrapper[4753]: I0129 14:59:58.617779 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5nf" event={"ID":"42fd6e07-17e1-47a5-bb1c-6dea325233be","Type":"ContainerStarted","Data":"bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824"} Jan 29 14:59:58 crc 
kubenswrapper[4753]: I0129 14:59:58.648922 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fj5nf" podStartSLOduration=3.226010252 podStartE2EDuration="5.648899409s" podCreationTimestamp="2026-01-29 14:59:53 +0000 UTC" firstStartedPulling="2026-01-29 14:59:55.592550148 +0000 UTC m=+3430.287284530" lastFinishedPulling="2026-01-29 14:59:58.015439295 +0000 UTC m=+3432.710173687" observedRunningTime="2026-01-29 14:59:58.637942715 +0000 UTC m=+3433.332677127" watchObservedRunningTime="2026-01-29 14:59:58.648899409 +0000 UTC m=+3433.343633791" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.177411 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg"] Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.178459 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.180980 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.180986 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.197558 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg"] Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.286076 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bdf128f-e268-4db9-8d94-73ff14869f6f-secret-volume\") pod \"collect-profiles-29494980-bwkhg\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.286296 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bdf128f-e268-4db9-8d94-73ff14869f6f-config-volume\") pod \"collect-profiles-29494980-bwkhg\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.286395 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrj5\" (UniqueName: \"kubernetes.io/projected/4bdf128f-e268-4db9-8d94-73ff14869f6f-kube-api-access-jvrj5\") pod \"collect-profiles-29494980-bwkhg\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.388501 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bdf128f-e268-4db9-8d94-73ff14869f6f-secret-volume\") pod \"collect-profiles-29494980-bwkhg\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.388796 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/4bdf128f-e268-4db9-8d94-73ff14869f6f-config-volume\") pod \"collect-profiles-29494980-bwkhg\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.388864 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrj5\" (UniqueName: \"kubernetes.io/projected/4bdf128f-e268-4db9-8d94-73ff14869f6f-kube-api-access-jvrj5\") pod \"collect-profiles-29494980-bwkhg\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.389849 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bdf128f-e268-4db9-8d94-73ff14869f6f-config-volume\") pod \"collect-profiles-29494980-bwkhg\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.394770 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bdf128f-e268-4db9-8d94-73ff14869f6f-secret-volume\") pod \"collect-profiles-29494980-bwkhg\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.405677 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrj5\" (UniqueName: \"kubernetes.io/projected/4bdf128f-e268-4db9-8d94-73ff14869f6f-kube-api-access-jvrj5\") pod \"collect-profiles-29494980-bwkhg\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.496773 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:00 crc kubenswrapper[4753]: I0129 15:00:00.722505 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg"] Jan 29 15:00:00 crc kubenswrapper[4753]: W0129 15:00:00.733666 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bdf128f_e268_4db9_8d94_73ff14869f6f.slice/crio-a62049ee42766cb828c86de7930930636ee6c3cdbef3ccdec0c7c94ec2dbe2ca WatchSource:0}: Error finding container a62049ee42766cb828c86de7930930636ee6c3cdbef3ccdec0c7c94ec2dbe2ca: Status 404 returned error can't find the container with id a62049ee42766cb828c86de7930930636ee6c3cdbef3ccdec0c7c94ec2dbe2ca Jan 29 15:00:01 crc kubenswrapper[4753]: I0129 15:00:01.638845 4753 generic.go:334] "Generic (PLEG): container finished" podID="4bdf128f-e268-4db9-8d94-73ff14869f6f" containerID="6fdb622d7cac9c4eaa8e3ffb1f44bffc115ed0d4186720085dff27d18687c99a" exitCode=0 Jan 29 15:00:01 crc kubenswrapper[4753]: I0129 15:00:01.638937 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" event={"ID":"4bdf128f-e268-4db9-8d94-73ff14869f6f","Type":"ContainerDied","Data":"6fdb622d7cac9c4eaa8e3ffb1f44bffc115ed0d4186720085dff27d18687c99a"} Jan 29 15:00:01 crc kubenswrapper[4753]: I0129 15:00:01.639169 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" event={"ID":"4bdf128f-e268-4db9-8d94-73ff14869f6f","Type":"ContainerStarted","Data":"a62049ee42766cb828c86de7930930636ee6c3cdbef3ccdec0c7c94ec2dbe2ca"} Jan 29 15:00:02 crc kubenswrapper[4753]: I0129 15:00:02.934646 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.023585 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvrj5\" (UniqueName: \"kubernetes.io/projected/4bdf128f-e268-4db9-8d94-73ff14869f6f-kube-api-access-jvrj5\") pod \"4bdf128f-e268-4db9-8d94-73ff14869f6f\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.023706 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bdf128f-e268-4db9-8d94-73ff14869f6f-secret-volume\") pod \"4bdf128f-e268-4db9-8d94-73ff14869f6f\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.023768 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bdf128f-e268-4db9-8d94-73ff14869f6f-config-volume\") pod \"4bdf128f-e268-4db9-8d94-73ff14869f6f\" (UID: \"4bdf128f-e268-4db9-8d94-73ff14869f6f\") " Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.024612 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bdf128f-e268-4db9-8d94-73ff14869f6f-config-volume" (OuterVolumeSpecName: "config-volume") pod "4bdf128f-e268-4db9-8d94-73ff14869f6f" (UID: "4bdf128f-e268-4db9-8d94-73ff14869f6f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.028439 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdf128f-e268-4db9-8d94-73ff14869f6f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4bdf128f-e268-4db9-8d94-73ff14869f6f" (UID: "4bdf128f-e268-4db9-8d94-73ff14869f6f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.028552 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdf128f-e268-4db9-8d94-73ff14869f6f-kube-api-access-jvrj5" (OuterVolumeSpecName: "kube-api-access-jvrj5") pod "4bdf128f-e268-4db9-8d94-73ff14869f6f" (UID: "4bdf128f-e268-4db9-8d94-73ff14869f6f"). InnerVolumeSpecName "kube-api-access-jvrj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.125722 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvrj5\" (UniqueName: \"kubernetes.io/projected/4bdf128f-e268-4db9-8d94-73ff14869f6f-kube-api-access-jvrj5\") on node \"crc\" DevicePath \"\"" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.125763 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4bdf128f-e268-4db9-8d94-73ff14869f6f-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.125776 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4bdf128f-e268-4db9-8d94-73ff14869f6f-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.653719 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" event={"ID":"4bdf128f-e268-4db9-8d94-73ff14869f6f","Type":"ContainerDied","Data":"a62049ee42766cb828c86de7930930636ee6c3cdbef3ccdec0c7c94ec2dbe2ca"} Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.653759 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a62049ee42766cb828c86de7930930636ee6c3cdbef3ccdec0c7c94ec2dbe2ca" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.653854 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.819986 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.820043 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 15:00:03 crc kubenswrapper[4753]: I0129 15:00:03.864021 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 15:00:04 crc kubenswrapper[4753]: I0129 15:00:04.002975 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8"] Jan 29 15:00:04 crc kubenswrapper[4753]: I0129 15:00:04.007845 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494935-n4hw8"] Jan 29 15:00:04 crc kubenswrapper[4753]: I0129 15:00:04.164299 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f981cab8-0955-4cb1-98ed-a7aecbca702c" path="/var/lib/kubelet/pods/f981cab8-0955-4cb1-98ed-a7aecbca702c/volumes" Jan 29 15:00:04 crc kubenswrapper[4753]: I0129 15:00:04.760884 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 15:00:04 crc kubenswrapper[4753]: I0129 15:00:04.846010 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5nf"] Jan 29 15:00:05 crc kubenswrapper[4753]: I0129 15:00:05.579113 4753 scope.go:117] "RemoveContainer" containerID="f73006e1eedd78c74143a97093d2d84e767137c79d54aec2c86268166a6321db" Jan 29 15:00:06 crc kubenswrapper[4753]: I0129 15:00:06.691344 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fj5nf" podUID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerName="registry-server" containerID="cri-o://bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824" gracePeriod=2 Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.137651 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.177827 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98n2r\" (UniqueName: \"kubernetes.io/projected/42fd6e07-17e1-47a5-bb1c-6dea325233be-kube-api-access-98n2r\") pod \"42fd6e07-17e1-47a5-bb1c-6dea325233be\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.177923 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-utilities\") pod \"42fd6e07-17e1-47a5-bb1c-6dea325233be\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.178036 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-catalog-content\") pod \"42fd6e07-17e1-47a5-bb1c-6dea325233be\" (UID: \"42fd6e07-17e1-47a5-bb1c-6dea325233be\") " Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.179097 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-utilities" (OuterVolumeSpecName: "utilities") pod "42fd6e07-17e1-47a5-bb1c-6dea325233be" (UID: "42fd6e07-17e1-47a5-bb1c-6dea325233be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.183750 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42fd6e07-17e1-47a5-bb1c-6dea325233be-kube-api-access-98n2r" (OuterVolumeSpecName: "kube-api-access-98n2r") pod "42fd6e07-17e1-47a5-bb1c-6dea325233be" (UID: "42fd6e07-17e1-47a5-bb1c-6dea325233be"). InnerVolumeSpecName "kube-api-access-98n2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.253124 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42fd6e07-17e1-47a5-bb1c-6dea325233be" (UID: "42fd6e07-17e1-47a5-bb1c-6dea325233be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.279613 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.279651 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98n2r\" (UniqueName: \"kubernetes.io/projected/42fd6e07-17e1-47a5-bb1c-6dea325233be-kube-api-access-98n2r\") on node \"crc\" DevicePath \"\"" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.279668 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd6e07-17e1-47a5-bb1c-6dea325233be-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.711930 4753 generic.go:334] "Generic (PLEG): container finished" podID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerID="bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824" exitCode=0 Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.712023 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj5nf" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.712023 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5nf" event={"ID":"42fd6e07-17e1-47a5-bb1c-6dea325233be","Type":"ContainerDied","Data":"bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824"} Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.713352 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5nf" event={"ID":"42fd6e07-17e1-47a5-bb1c-6dea325233be","Type":"ContainerDied","Data":"52631898c65d0f8b282139e906803d8d31c9248b904cd2f630d67ef77ec7ea54"} Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.713376 4753 scope.go:117] "RemoveContainer" containerID="bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.736175 4753 scope.go:117] "RemoveContainer" containerID="8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.750342 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5nf"] Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.765347 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5nf"] Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.769984 4753 scope.go:117] "RemoveContainer" containerID="7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.802950 4753 scope.go:117] "RemoveContainer" containerID="bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824" Jan 29 15:00:07 crc kubenswrapper[4753]: E0129 15:00:07.803248 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824\": container with ID starting with bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824 not found: ID does not exist" containerID="bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.803277 4753 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824"} err="failed to get container status \"bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824\": rpc error: code = NotFound desc = could not find container \"bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824\": container with ID starting with bb860a0c7231f0796dbbccef4d1734484ea8122a212b8000ff6d83626c1d6824 not found: ID does not exist" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.803298 4753 scope.go:117] "RemoveContainer" containerID="8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d" Jan 29 15:00:07 crc kubenswrapper[4753]: E0129 15:00:07.803487 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d\": container with ID starting with 8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d not found: ID does not exist" containerID="8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.803583 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d"} err="failed to get container status \"8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d\": rpc error: code = NotFound desc = could not find container \"8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d\": container with ID starting with 8ac4817df087a8c44f8f8864b897e7c14d928de941f3e834829a9259e698ba7d not found: ID does not exist" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.803661 4753 scope.go:117] "RemoveContainer" containerID="7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002" Jan 29 15:00:07 crc kubenswrapper[4753]: E0129 15:00:07.803906 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002\": container with ID starting with 7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002 not found: ID does not exist" containerID="7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002" Jan 29 15:00:07 crc kubenswrapper[4753]: I0129 15:00:07.803962 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002"} err="failed to get container status \"7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002\": rpc error: code = NotFound desc = could not find container \"7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002\": container with ID starting with 7f8b3563bed875dc8cfc58e154c4e8b2c5e9168b55d2dcb2167197aeff6e4002 not found: ID does not exist" Jan 29 15:00:08 crc kubenswrapper[4753]: I0129 15:00:08.157966 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42fd6e07-17e1-47a5-bb1c-6dea325233be" path="/var/lib/kubelet/pods/42fd6e07-17e1-47a5-bb1c-6dea325233be/volumes" Jan 29 15:00:27 crc kubenswrapper[4753]: I0129 15:00:27.055054 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:00:27 crc kubenswrapper[4753]: I0129 15:00:27.055841 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:00:27 crc kubenswrapper[4753]: I0129 15:00:27.055917 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 15:00:27 crc kubenswrapper[4753]: I0129 15:00:27.056844 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce8ef4063602211ddb25c697cc29c8c5034bf98ea655922724a788bd505605e4"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 15:00:27 crc kubenswrapper[4753]: I0129 15:00:27.056944 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://ce8ef4063602211ddb25c697cc29c8c5034bf98ea655922724a788bd505605e4" gracePeriod=600 Jan 29 15:00:27 crc kubenswrapper[4753]: I0129 15:00:27.856519 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="ce8ef4063602211ddb25c697cc29c8c5034bf98ea655922724a788bd505605e4" exitCode=0 Jan 29 15:00:27 crc kubenswrapper[4753]: I0129 15:00:27.856613 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"ce8ef4063602211ddb25c697cc29c8c5034bf98ea655922724a788bd505605e4"} Jan 29 15:00:27 crc kubenswrapper[4753]: I0129 15:00:27.856914 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"} Jan 29 15:00:27 crc kubenswrapper[4753]: I0129 15:00:27.856933 4753 scope.go:117] "RemoveContainer" containerID="ea37537eaf675c292721245d6dd331f56290203e93b20b62b70cff896c40f2a8" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.480924 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zlzw6"] Jan 29 15:00:51 crc kubenswrapper[4753]: E0129 15:00:51.481854 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdf128f-e268-4db9-8d94-73ff14869f6f" containerName="collect-profiles" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.481869 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdf128f-e268-4db9-8d94-73ff14869f6f" containerName="collect-profiles" Jan 29 15:00:51 crc kubenswrapper[4753]: E0129 15:00:51.481882 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerName="extract-utilities" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.481890 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd6e07-17e1-47a5-bb1c-6dea325233be" 
containerName="extract-utilities" Jan 29 15:00:51 crc kubenswrapper[4753]: E0129 15:00:51.481903 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerName="extract-content" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.481910 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerName="extract-content" Jan 29 15:00:51 crc kubenswrapper[4753]: E0129 15:00:51.481928 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerName="registry-server" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.481936 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerName="registry-server" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.482105 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="42fd6e07-17e1-47a5-bb1c-6dea325233be" containerName="registry-server" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.482136 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bdf128f-e268-4db9-8d94-73ff14869f6f" containerName="collect-profiles" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.483460 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.501928 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zlzw6"] Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.551842 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-utilities\") pod \"community-operators-zlzw6\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.552099 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-catalog-content\") pod \"community-operators-zlzw6\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.552211 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkr72\" (UniqueName: \"kubernetes.io/projected/c74c7127-84ee-4b27-9915-59054432f7c4-kube-api-access-fkr72\") pod \"community-operators-zlzw6\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.653924 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-utilities\") pod \"community-operators-zlzw6\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.654055 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-catalog-content\") pod \"community-operators-zlzw6\" (UID: 
\"c74c7127-84ee-4b27-9915-59054432f7c4\") " pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.654091 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkr72\" (UniqueName: \"kubernetes.io/projected/c74c7127-84ee-4b27-9915-59054432f7c4-kube-api-access-fkr72\") pod \"community-operators-zlzw6\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.654751 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-utilities\") pod \"community-operators-zlzw6\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.654785 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-catalog-content\") pod \"community-operators-zlzw6\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.675076 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkr72\" (UniqueName: \"kubernetes.io/projected/c74c7127-84ee-4b27-9915-59054432f7c4-kube-api-access-fkr72\") pod \"community-operators-zlzw6\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:51 crc kubenswrapper[4753]: I0129 15:00:51.802561 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:00:52 crc kubenswrapper[4753]: I0129 15:00:52.280468 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zlzw6"] Jan 29 15:00:53 crc kubenswrapper[4753]: I0129 15:00:53.153038 4753 generic.go:334] "Generic (PLEG): container finished" podID="c74c7127-84ee-4b27-9915-59054432f7c4" containerID="f7f320896f543286c1326232263e2813b32374f78664b23af10586a62f3ba23c" exitCode=0 Jan 29 15:00:53 crc kubenswrapper[4753]: I0129 15:00:53.153099 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlzw6" event={"ID":"c74c7127-84ee-4b27-9915-59054432f7c4","Type":"ContainerDied","Data":"f7f320896f543286c1326232263e2813b32374f78664b23af10586a62f3ba23c"} Jan 29 15:00:53 crc kubenswrapper[4753]: I0129 15:00:53.153438 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlzw6" event={"ID":"c74c7127-84ee-4b27-9915-59054432f7c4","Type":"ContainerStarted","Data":"32a320b8b8ee8d96b6c017ba050f3e811d846b58ba7ad2f13ff755d899233bf9"} Jan 29 15:00:53 crc kubenswrapper[4753]: E0129 15:00:53.281348 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 15:00:53 crc kubenswrapper[4753]: E0129 15:00:53.281478 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkr72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zlzw6_openshift-marketplace(c74c7127-84ee-4b27-9915-59054432f7c4): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:00:53 crc kubenswrapper[4753]: 
E0129 15:00:53.282760 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-zlzw6" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" Jan 29 15:00:54 crc kubenswrapper[4753]: E0129 15:00:54.170248 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zlzw6" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" Jan 29 15:01:07 crc kubenswrapper[4753]: E0129 15:01:07.278212 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 15:01:07 crc kubenswrapper[4753]: E0129 15:01:07.278691 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkr72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zlzw6_openshift-marketplace(c74c7127-84ee-4b27-9915-59054432f7c4): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:01:07 crc kubenswrapper[4753]: E0129 15:01:07.280047 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" 
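The two failure modes above alternate for the rest of this capture: ErrImagePull is logged when kubelet actually retries the pull, ImagePullBackOff while it sits out the wait between retries. The wait grows roughly exponentially with a hard ceiling; a minimal Go sketch of that shape, assuming the commonly cited 10s initial delay and the 5m cap that kubelet's own back-off messages in this log report (the exact constants are an assumption, not read out of this kubelet build):

package main

import (
	"fmt"
	"time"
)

// Doubling back-off with a cap, the schedule kubelet applies between
// image pull retries while a pod sits in ImagePullBackOff.
func main() {
	delay := 10 * time.Second  // assumed initial delay
	maxDelay := 5 * time.Minute // cap, matching the "back-off 5m0s" messages
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("retry %d after %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

Once the delay saturates, retries settle to one every five minutes, which is why a persistent 403 like this one produces evenly spaced ErrImagePull bursts late in a log.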
pod="openshift-marketplace/community-operators-zlzw6" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" Jan 29 15:01:19 crc kubenswrapper[4753]: E0129 15:01:19.151689 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zlzw6" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" Jan 29 15:01:31 crc kubenswrapper[4753]: E0129 15:01:31.281518 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 15:01:31 crc kubenswrapper[4753]: E0129 15:01:31.282226 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkr72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zlzw6_openshift-marketplace(c74c7127-84ee-4b27-9915-59054432f7c4): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:01:31 crc kubenswrapper[4753]: E0129 15:01:31.283851 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-zlzw6" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" Jan 29 15:01:46 crc kubenswrapper[4753]: E0129 15:01:46.160562 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zlzw6" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" Jan 29 15:01:57 crc kubenswrapper[4753]: E0129 15:01:57.152498 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zlzw6" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" Jan 29 15:02:11 crc kubenswrapper[4753]: E0129 15:02:11.153471 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zlzw6" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" Jan 29 15:02:23 crc kubenswrapper[4753]: I0129 15:02:23.903514 4753 generic.go:334] "Generic (PLEG): container finished" podID="c74c7127-84ee-4b27-9915-59054432f7c4" containerID="da26c0fc027e15496a3112ce09763825e3f8ad21c47b09c62effd5b60da9b6a1" exitCode=0 Jan 29 15:02:23 crc kubenswrapper[4753]: I0129 15:02:23.903555 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlzw6" event={"ID":"c74c7127-84ee-4b27-9915-59054432f7c4","Type":"ContainerDied","Data":"da26c0fc027e15496a3112ce09763825e3f8ad21c47b09c62effd5b60da9b6a1"} Jan 29 15:02:24 crc kubenswrapper[4753]: I0129 15:02:24.911601 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlzw6" event={"ID":"c74c7127-84ee-4b27-9915-59054432f7c4","Type":"ContainerStarted","Data":"b317fd122f1c0d5e9e4d7718cf929319233caf22ed8f9e37d2760682eaef76c6"} Jan 29 15:02:24 crc kubenswrapper[4753]: I0129 15:02:24.933765 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zlzw6" podStartSLOduration=2.438413646 podStartE2EDuration="1m33.933741197s" podCreationTimestamp="2026-01-29 15:00:51 +0000 UTC" firstStartedPulling="2026-01-29 15:00:53.1545984 +0000 UTC m=+3487.849332792" lastFinishedPulling="2026-01-29 15:02:24.649925961 +0000 UTC m=+3579.344660343" observedRunningTime="2026-01-29 15:02:24.92979468 +0000 UTC m=+3579.624529072" watchObservedRunningTime="2026-01-29 15:02:24.933741197 +0000 UTC m=+3579.628475579" Jan 29 15:02:27 crc kubenswrapper[4753]: I0129 15:02:27.055423 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:02:27 crc kubenswrapper[4753]: I0129 15:02:27.055866 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:02:31 crc kubenswrapper[4753]: I0129 15:02:31.802787 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:02:31 crc kubenswrapper[4753]: I0129 15:02:31.803446 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:02:31 crc kubenswrapper[4753]: I0129 15:02:31.863723 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:02:32 crc kubenswrapper[4753]: I0129 15:02:32.007530 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:02:32 crc kubenswrapper[4753]: I0129 15:02:32.113346 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zlzw6"] Jan 29 15:02:33 crc kubenswrapper[4753]: I0129 15:02:33.981402 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zlzw6" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" containerName="registry-server" containerID="cri-o://b317fd122f1c0d5e9e4d7718cf929319233caf22ed8f9e37d2760682eaef76c6" gracePeriod=2 Jan 29 15:02:34 crc kubenswrapper[4753]: I0129 15:02:34.990683 4753 generic.go:334] "Generic (PLEG): container finished" podID="c74c7127-84ee-4b27-9915-59054432f7c4" containerID="b317fd122f1c0d5e9e4d7718cf929319233caf22ed8f9e37d2760682eaef76c6" exitCode=0 Jan 29 15:02:34 crc kubenswrapper[4753]: I0129 15:02:34.990780 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlzw6" event={"ID":"c74c7127-84ee-4b27-9915-59054432f7c4","Type":"ContainerDied","Data":"b317fd122f1c0d5e9e4d7718cf929319233caf22ed8f9e37d2760682eaef76c6"} Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.052978 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.157465 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-utilities\") pod \"c74c7127-84ee-4b27-9915-59054432f7c4\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.157777 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkr72\" (UniqueName: \"kubernetes.io/projected/c74c7127-84ee-4b27-9915-59054432f7c4-kube-api-access-fkr72\") pod \"c74c7127-84ee-4b27-9915-59054432f7c4\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.157915 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-catalog-content\") pod \"c74c7127-84ee-4b27-9915-59054432f7c4\" (UID: \"c74c7127-84ee-4b27-9915-59054432f7c4\") " Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.158441 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-utilities" (OuterVolumeSpecName: "utilities") pod "c74c7127-84ee-4b27-9915-59054432f7c4" (UID: "c74c7127-84ee-4b27-9915-59054432f7c4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.163225 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74c7127-84ee-4b27-9915-59054432f7c4-kube-api-access-fkr72" (OuterVolumeSpecName: "kube-api-access-fkr72") pod "c74c7127-84ee-4b27-9915-59054432f7c4" (UID: "c74c7127-84ee-4b27-9915-59054432f7c4"). InnerVolumeSpecName "kube-api-access-fkr72". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.216801 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c74c7127-84ee-4b27-9915-59054432f7c4" (UID: "c74c7127-84ee-4b27-9915-59054432f7c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.259835 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.259871 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c74c7127-84ee-4b27-9915-59054432f7c4-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:02:35 crc kubenswrapper[4753]: I0129 15:02:35.259885 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkr72\" (UniqueName: \"kubernetes.io/projected/c74c7127-84ee-4b27-9915-59054432f7c4-kube-api-access-fkr72\") on node \"crc\" DevicePath \"\"" Jan 29 15:02:36 crc kubenswrapper[4753]: I0129 15:02:36.004191 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlzw6" event={"ID":"c74c7127-84ee-4b27-9915-59054432f7c4","Type":"ContainerDied","Data":"32a320b8b8ee8d96b6c017ba050f3e811d846b58ba7ad2f13ff755d899233bf9"} Jan 29 15:02:36 crc kubenswrapper[4753]: I0129 15:02:36.004321 4753 scope.go:117] "RemoveContainer" containerID="b317fd122f1c0d5e9e4d7718cf929319233caf22ed8f9e37d2760682eaef76c6" Jan 29 15:02:36 crc kubenswrapper[4753]: I0129 15:02:36.005432 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zlzw6" Jan 29 15:02:36 crc kubenswrapper[4753]: I0129 15:02:36.037102 4753 scope.go:117] "RemoveContainer" containerID="da26c0fc027e15496a3112ce09763825e3f8ad21c47b09c62effd5b60da9b6a1" Jan 29 15:02:36 crc kubenswrapper[4753]: I0129 15:02:36.060976 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zlzw6"] Jan 29 15:02:36 crc kubenswrapper[4753]: I0129 15:02:36.062258 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zlzw6"] Jan 29 15:02:36 crc kubenswrapper[4753]: I0129 15:02:36.090712 4753 scope.go:117] "RemoveContainer" containerID="f7f320896f543286c1326232263e2813b32374f78664b23af10586a62f3ba23c" Jan 29 15:02:36 crc kubenswrapper[4753]: I0129 15:02:36.156928 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" path="/var/lib/kubelet/pods/c74c7127-84ee-4b27-9915-59054432f7c4/volumes" Jan 29 15:02:57 crc kubenswrapper[4753]: I0129 15:02:57.054356 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:02:57 crc kubenswrapper[4753]: I0129 15:02:57.054862 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:03:27 crc kubenswrapper[4753]: I0129 15:03:27.055016 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:03:27 crc kubenswrapper[4753]: I0129 15:03:27.055836 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:03:27 crc kubenswrapper[4753]: I0129 15:03:27.055888 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 15:03:27 crc kubenswrapper[4753]: I0129 15:03:27.056844 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 15:03:27 crc kubenswrapper[4753]: I0129 15:03:27.056933 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e" 
Jan 29 15:03:27 crc kubenswrapper[4753]: E0129 15:03:27.235135 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:03:27 crc kubenswrapper[4753]: I0129 15:03:27.436879 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e" exitCode=0
Jan 29 15:03:27 crc kubenswrapper[4753]: I0129 15:03:27.436922 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"}
Jan 29 15:03:27 crc kubenswrapper[4753]: I0129 15:03:27.436950 4753 scope.go:117] "RemoveContainer" containerID="ce8ef4063602211ddb25c697cc29c8c5034bf98ea655922724a788bd505605e4"
Jan 29 15:03:27 crc kubenswrapper[4753]: I0129 15:03:27.437499 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:03:27 crc kubenswrapper[4753]: E0129 15:03:27.437778 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:03:39 crc kubenswrapper[4753]: I0129 15:03:39.149530 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:03:39 crc kubenswrapper[4753]: E0129 15:03:39.150377 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:03:50 crc kubenswrapper[4753]: I0129 15:03:50.149565 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:03:50 crc kubenswrapper[4753]: E0129 15:03:50.150406 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:04:04 crc kubenswrapper[4753]: I0129 15:04:04.150130 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:04:04 crc kubenswrapper[4753]: E0129 15:04:04.150906 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:04:18 crc kubenswrapper[4753]: I0129 15:04:18.150063 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:04:18 crc kubenswrapper[4753]: E0129 15:04:18.150978 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:04:32 crc kubenswrapper[4753]: I0129 15:04:32.149197 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:04:32 crc kubenswrapper[4753]: E0129 15:04:32.150035 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:04:43 crc kubenswrapper[4753]: I0129 15:04:43.149810 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:04:43 crc kubenswrapper[4753]: E0129 15:04:43.150811 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:04:56 crc kubenswrapper[4753]: I0129 15:04:56.158298 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:04:56 crc kubenswrapper[4753]: E0129 15:04:56.159426 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:05:10 crc kubenswrapper[4753]: I0129 15:05:10.150687 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:05:10 crc kubenswrapper[4753]: E0129 15:05:10.152512 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:05:23 crc kubenswrapper[4753]: I0129 15:05:23.149133 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:05:23 crc kubenswrapper[4753]: E0129 15:05:23.149954 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:05:38 crc kubenswrapper[4753]: I0129 15:05:38.151413 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:05:38 crc kubenswrapper[4753]: E0129 15:05:38.152469 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:05:50 crc kubenswrapper[4753]: I0129 15:05:50.149680 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:05:50 crc kubenswrapper[4753]: E0129 15:05:50.150559 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:06:05 crc kubenswrapper[4753]: I0129 15:06:05.149844 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:06:05 crc kubenswrapper[4753]: E0129 15:06:05.150671 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:06:20 crc kubenswrapper[4753]: I0129 15:06:20.149941 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:06:20 crc kubenswrapper[4753]: E0129 15:06:20.151202 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:06:32 crc kubenswrapper[4753]: I0129 15:06:32.150261 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:06:32 crc kubenswrapper[4753]: E0129 15:06:32.151100 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:06:46 crc kubenswrapper[4753]: I0129 15:06:46.157123 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:06:46 crc kubenswrapper[4753]: E0129 15:06:46.158184 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:07:00 crc kubenswrapper[4753]: I0129 15:07:00.151077 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:07:00 crc kubenswrapper[4753]: E0129 15:07:00.151788 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:07:14 crc kubenswrapper[4753]: I0129 15:07:14.149676 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:07:14 crc kubenswrapper[4753]: E0129 15:07:14.150446 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:07:25 crc kubenswrapper[4753]: I0129 15:07:25.149886 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:07:25 crc kubenswrapper[4753]: E0129 15:07:25.150687 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:07:39 crc kubenswrapper[4753]: I0129 15:07:39.150465 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:07:39 crc kubenswrapper[4753]: E0129 15:07:39.151707 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.172572 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lj66p"]
Jan 29 15:07:53 crc kubenswrapper[4753]: E0129 15:07:53.173588 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" containerName="extract-content"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.173610 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" containerName="extract-content"
Jan 29 15:07:53 crc kubenswrapper[4753]: E0129 15:07:53.173629 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" containerName="extract-utilities"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.173640 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" containerName="extract-utilities"
Jan 29 15:07:53 crc kubenswrapper[4753]: E0129 15:07:53.173658 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" containerName="registry-server"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.173671 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" containerName="registry-server"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.173926 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74c7127-84ee-4b27-9915-59054432f7c4" containerName="registry-server"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.176624 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.199785 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lj66p"]
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.334651 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-catalog-content\") pod \"redhat-operators-lj66p\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.334762 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwg7m\" (UniqueName: \"kubernetes.io/projected/7a179da2-a3c8-4c5f-aedf-be464d70fddc-kube-api-access-qwg7m\") pod \"redhat-operators-lj66p\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.334814 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-utilities\") pod \"redhat-operators-lj66p\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.436357 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-catalog-content\") pod \"redhat-operators-lj66p\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.436416 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwg7m\" (UniqueName: \"kubernetes.io/projected/7a179da2-a3c8-4c5f-aedf-be464d70fddc-kube-api-access-qwg7m\") pod \"redhat-operators-lj66p\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.436445 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-utilities\") pod \"redhat-operators-lj66p\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.436916 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-catalog-content\") pod \"redhat-operators-lj66p\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.436992 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-utilities\") pod \"redhat-operators-lj66p\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.460761 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwg7m\" (UniqueName: \"kubernetes.io/projected/7a179da2-a3c8-4c5f-aedf-be464d70fddc-kube-api-access-qwg7m\") pod \"redhat-operators-lj66p\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:53 crc kubenswrapper[4753]: I0129 15:07:53.507498 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lj66p"
Jan 29 15:07:54 crc kubenswrapper[4753]: I0129 15:07:54.000134 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lj66p"]
Jan 29 15:07:54 crc kubenswrapper[4753]: I0129 15:07:54.149311 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:07:54 crc kubenswrapper[4753]: E0129 15:07:54.149658 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:07:54 crc kubenswrapper[4753]: I0129 15:07:54.471288 4753 generic.go:334] "Generic (PLEG): container finished" podID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerID="a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4" exitCode=0
Jan 29 15:07:54 crc kubenswrapper[4753]: I0129 15:07:54.471588 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj66p" event={"ID":"7a179da2-a3c8-4c5f-aedf-be464d70fddc","Type":"ContainerDied","Data":"a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4"}
Jan 29 15:07:54 crc kubenswrapper[4753]: I0129 15:07:54.472289 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj66p" event={"ID":"7a179da2-a3c8-4c5f-aedf-be464d70fddc","Type":"ContainerStarted","Data":"64fb66bd99dd4e39483bc82c83e234623ab036a6ad51909f68ab728430c3a7bc"}
Jan 29 15:07:54 crc kubenswrapper[4753]: I0129 15:07:54.473113 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 29 15:07:54 crc kubenswrapper[4753]: E0129 15:07:54.606357 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Jan 29 15:07:54 crc kubenswrapper[4753]: E0129 15:07:54.606508 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwg7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lj66p_openshift-marketplace(7a179da2-a3c8-4c5f-aedf-be464d70fddc): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:07:54 crc kubenswrapper[4753]: E0129 15:07:54.607698 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc"
Jan 29 15:07:55 crc kubenswrapper[4753]: E0129 15:07:55.483912 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc"
Jan 29 15:08:05 crc kubenswrapper[4753]: I0129 15:08:05.149885 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:08:05 crc kubenswrapper[4753]: E0129 15:08:05.150617 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:08:06 crc kubenswrapper[4753]: E0129 15:08:06.285553 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Jan 29 15:08:06 crc kubenswrapper[4753]: E0129 15:08:06.286395 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwg7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lj66p_openshift-marketplace(7a179da2-a3c8-4c5f-aedf-be464d70fddc): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:08:06 crc kubenswrapper[4753]: E0129 15:08:06.287881 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc"
Jan 29 15:08:19 crc kubenswrapper[4753]: E0129 15:08:19.153719 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc"
Jan 29 15:08:20 crc kubenswrapper[4753]: I0129 15:08:20.150523 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:08:20 crc kubenswrapper[4753]: E0129 15:08:20.151948 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:08:30 crc kubenswrapper[4753]: E0129 15:08:30.300480 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Jan 29 15:08:30 crc kubenswrapper[4753]: E0129 15:08:30.301136 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwg7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lj66p_openshift-marketplace(7a179da2-a3c8-4c5f-aedf-be464d70fddc): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:08:30 crc kubenswrapper[4753]: E0129 15:08:30.302383 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc"
Jan 29 15:08:32 crc kubenswrapper[4753]: I0129 15:08:32.149691 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e"
Jan 29 15:08:33 crc kubenswrapper[4753]: I0129 15:08:33.797323 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"3dcb3a962df2116db3467e2a4acd6e1502acdc60563dc696461415f1f8a1f591"}
Jan 29 15:08:44 crc kubenswrapper[4753]: E0129 15:08:44.152689 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc"
podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:08:58 crc kubenswrapper[4753]: E0129 15:08:58.151678 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:09:05 crc kubenswrapper[4753]: I0129 15:09:05.874997 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7hdjw"] Jan 29 15:09:05 crc kubenswrapper[4753]: I0129 15:09:05.876946 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:05 crc kubenswrapper[4753]: I0129 15:09:05.892602 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7hdjw"] Jan 29 15:09:05 crc kubenswrapper[4753]: I0129 15:09:05.956008 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-catalog-content\") pod \"certified-operators-7hdjw\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:05 crc kubenswrapper[4753]: I0129 15:09:05.956061 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg4lz\" (UniqueName: \"kubernetes.io/projected/beeada0f-43d9-4a2b-8faf-08a90f6223a6-kube-api-access-cg4lz\") pod \"certified-operators-7hdjw\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:05 crc kubenswrapper[4753]: I0129 15:09:05.956138 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-utilities\") pod \"certified-operators-7hdjw\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:06 crc kubenswrapper[4753]: I0129 15:09:06.057800 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-catalog-content\") pod \"certified-operators-7hdjw\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:06 crc kubenswrapper[4753]: I0129 15:09:06.058092 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg4lz\" (UniqueName: \"kubernetes.io/projected/beeada0f-43d9-4a2b-8faf-08a90f6223a6-kube-api-access-cg4lz\") pod \"certified-operators-7hdjw\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:06 crc kubenswrapper[4753]: I0129 15:09:06.058264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-utilities\") pod \"certified-operators-7hdjw\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:06 crc kubenswrapper[4753]: I0129 15:09:06.058412 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-catalog-content\") pod \"certified-operators-7hdjw\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:06 crc kubenswrapper[4753]: I0129 15:09:06.058793 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-utilities\") pod \"certified-operators-7hdjw\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:06 crc kubenswrapper[4753]: I0129 15:09:06.081944 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg4lz\" (UniqueName: \"kubernetes.io/projected/beeada0f-43d9-4a2b-8faf-08a90f6223a6-kube-api-access-cg4lz\") pod \"certified-operators-7hdjw\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:06 crc kubenswrapper[4753]: I0129 15:09:06.200959 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:09:06 crc kubenswrapper[4753]: I0129 15:09:06.673425 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7hdjw"] Jan 29 15:09:06 crc kubenswrapper[4753]: W0129 15:09:06.676939 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeeada0f_43d9_4a2b_8faf_08a90f6223a6.slice/crio-531cad93af8a3e60bcd67a01716b872abcc8e5da92c8bcb60a0930404245f820 WatchSource:0}: Error finding container 531cad93af8a3e60bcd67a01716b872abcc8e5da92c8bcb60a0930404245f820: Status 404 returned error can't find the container with id 531cad93af8a3e60bcd67a01716b872abcc8e5da92c8bcb60a0930404245f820 Jan 29 15:09:07 crc kubenswrapper[4753]: I0129 15:09:07.062738 4753 generic.go:334] "Generic (PLEG): container finished" podID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerID="5bb67c75953cd913c691f9316db2e8e1d91944bc68610a76f05211bed010adbf" exitCode=0 Jan 29 15:09:07 crc kubenswrapper[4753]: I0129 15:09:07.062801 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hdjw" event={"ID":"beeada0f-43d9-4a2b-8faf-08a90f6223a6","Type":"ContainerDied","Data":"5bb67c75953cd913c691f9316db2e8e1d91944bc68610a76f05211bed010adbf"} Jan 29 15:09:07 crc kubenswrapper[4753]: I0129 15:09:07.064652 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hdjw" event={"ID":"beeada0f-43d9-4a2b-8faf-08a90f6223a6","Type":"ContainerStarted","Data":"531cad93af8a3e60bcd67a01716b872abcc8e5da92c8bcb60a0930404245f820"} Jan 29 15:09:07 crc kubenswrapper[4753]: E0129 15:09:07.197664 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 15:09:07 crc kubenswrapper[4753]: E0129 15:09:07.197875 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
Jan 29 15:09:07 crc kubenswrapper[4753]: E0129 15:09:07.197875 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cg4lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7hdjw_openshift-marketplace(beeada0f-43d9-4a2b-8faf-08a90f6223a6): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:09:07 crc kubenswrapper[4753]: E0129 15:09:07.205633 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6"
Jan 29 15:09:08 crc kubenswrapper[4753]: E0129 15:09:08.077592 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6"
Jan 29 15:09:10 crc kubenswrapper[4753]: E0129 15:09:10.151842 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc"
Jan 29 15:09:19 crc kubenswrapper[4753]: E0129 15:09:19.279958 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Jan 29 15:09:19 crc kubenswrapper[4753]: E0129 15:09:19.280726 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cg4lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7hdjw_openshift-marketplace(beeada0f-43d9-4a2b-8faf-08a90f6223a6): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:09:19 crc kubenswrapper[4753]: E0129 15:09:19.281961 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:09:21 crc kubenswrapper[4753]: E0129 15:09:21.270082 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 15:09:21 crc kubenswrapper[4753]: E0129 15:09:21.270302 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwg7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lj66p_openshift-marketplace(7a179da2-a3c8-4c5f-aedf-be464d70fddc): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:09:21 crc kubenswrapper[4753]: E0129 15:09:21.271529 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:09:31 crc kubenswrapper[4753]: E0129 15:09:31.152640 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:09:33 crc kubenswrapper[4753]: E0129 15:09:33.150828 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:09:43 crc kubenswrapper[4753]: E0129 15:09:43.748863 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 15:09:43 crc kubenswrapper[4753]: E0129 15:09:43.749604 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cg4lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7hdjw_openshift-marketplace(beeada0f-43d9-4a2b-8faf-08a90f6223a6): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:09:43 crc kubenswrapper[4753]: E0129 15:09:43.750836 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:09:45 crc kubenswrapper[4753]: E0129 15:09:45.151287 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:09:56 crc kubenswrapper[4753]: E0129 15:09:56.151964 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:09:58 crc kubenswrapper[4753]: E0129 15:09:58.151744 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:10:07 crc kubenswrapper[4753]: E0129 15:10:07.151531 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:10:09 crc kubenswrapper[4753]: E0129 15:10:09.151737 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:10:18 crc kubenswrapper[4753]: E0129 15:10:18.151734 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:10:20 crc kubenswrapper[4753]: E0129 15:10:20.151206 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:10:30 crc kubenswrapper[4753]: E0129 15:10:30.153463 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:10:33 crc kubenswrapper[4753]: E0129 15:10:33.295055 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 15:10:33 crc kubenswrapper[4753]: E0129 15:10:33.295324 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cg4lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7hdjw_openshift-marketplace(beeada0f-43d9-4a2b-8faf-08a90f6223a6): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:10:33 crc kubenswrapper[4753]: E0129 15:10:33.296600 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:10:41 crc kubenswrapper[4753]: E0129 15:10:41.152094 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" Jan 29 15:10:46 crc kubenswrapper[4753]: E0129 15:10:46.159057 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:10:54 crc kubenswrapper[4753]: I0129 15:10:54.844741 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj66p" event={"ID":"7a179da2-a3c8-4c5f-aedf-be464d70fddc","Type":"ContainerStarted","Data":"292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735"} Jan 29 15:10:55 crc kubenswrapper[4753]: I0129 15:10:55.860877 4753 generic.go:334] "Generic (PLEG): container finished" podID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerID="292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735" exitCode=0 Jan 29 15:10:55 crc kubenswrapper[4753]: I0129 15:10:55.861115 4753 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-lj66p" event={"ID":"7a179da2-a3c8-4c5f-aedf-be464d70fddc","Type":"ContainerDied","Data":"292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735"} Jan 29 15:10:57 crc kubenswrapper[4753]: I0129 15:10:57.054864 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:10:57 crc kubenswrapper[4753]: I0129 15:10:57.054981 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:10:57 crc kubenswrapper[4753]: I0129 15:10:57.887210 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj66p" event={"ID":"7a179da2-a3c8-4c5f-aedf-be464d70fddc","Type":"ContainerStarted","Data":"c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3"} Jan 29 15:10:59 crc kubenswrapper[4753]: E0129 15:10:59.151817 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:10:59 crc kubenswrapper[4753]: I0129 15:10:59.184436 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lj66p" podStartSLOduration=3.424345667 podStartE2EDuration="3m6.18441369s" podCreationTimestamp="2026-01-29 15:07:53 +0000 UTC" firstStartedPulling="2026-01-29 15:07:54.472877094 +0000 UTC m=+3909.167611476" lastFinishedPulling="2026-01-29 15:10:57.232945107 +0000 UTC m=+4091.927679499" observedRunningTime="2026-01-29 15:10:57.909917884 +0000 UTC m=+4092.604652276" watchObservedRunningTime="2026-01-29 15:10:59.18441369 +0000 UTC m=+4093.879148082" Jan 29 15:11:03 crc kubenswrapper[4753]: I0129 15:11:03.508045 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lj66p" Jan 29 15:11:03 crc kubenswrapper[4753]: I0129 15:11:03.508550 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lj66p" Jan 29 15:11:04 crc kubenswrapper[4753]: I0129 15:11:04.549498 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerName="registry-server" probeResult="failure" output=< Jan 29 15:11:04 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 15:11:04 crc kubenswrapper[4753]: > Jan 29 15:11:13 crc kubenswrapper[4753]: E0129 15:11:13.151453 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:11:13 crc kubenswrapper[4753]: I0129 15:11:13.554468 4753 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lj66p" Jan 29 15:11:13 crc kubenswrapper[4753]: I0129 15:11:13.594093 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lj66p" Jan 29 15:11:13 crc kubenswrapper[4753]: I0129 15:11:13.802182 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lj66p"] Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.021915 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lj66p" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerName="registry-server" containerID="cri-o://c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3" gracePeriod=2 Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.584960 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lj66p" Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.606596 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-catalog-content\") pod \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.606677 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwg7m\" (UniqueName: \"kubernetes.io/projected/7a179da2-a3c8-4c5f-aedf-be464d70fddc-kube-api-access-qwg7m\") pod \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.606778 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-utilities\") pod \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\" (UID: \"7a179da2-a3c8-4c5f-aedf-be464d70fddc\") " Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.607943 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-utilities" (OuterVolumeSpecName: "utilities") pod "7a179da2-a3c8-4c5f-aedf-be464d70fddc" (UID: "7a179da2-a3c8-4c5f-aedf-be464d70fddc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.621570 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a179da2-a3c8-4c5f-aedf-be464d70fddc-kube-api-access-qwg7m" (OuterVolumeSpecName: "kube-api-access-qwg7m") pod "7a179da2-a3c8-4c5f-aedf-be464d70fddc" (UID: "7a179da2-a3c8-4c5f-aedf-be464d70fddc"). InnerVolumeSpecName "kube-api-access-qwg7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.708867 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.708911 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwg7m\" (UniqueName: \"kubernetes.io/projected/7a179da2-a3c8-4c5f-aedf-be464d70fddc-kube-api-access-qwg7m\") on node \"crc\" DevicePath \"\"" Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.738887 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a179da2-a3c8-4c5f-aedf-be464d70fddc" (UID: "7a179da2-a3c8-4c5f-aedf-be464d70fddc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:11:15 crc kubenswrapper[4753]: I0129 15:11:15.810387 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a179da2-a3c8-4c5f-aedf-be464d70fddc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.029548 4753 generic.go:334] "Generic (PLEG): container finished" podID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerID="c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3" exitCode=0 Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.029610 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lj66p" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.029608 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj66p" event={"ID":"7a179da2-a3c8-4c5f-aedf-be464d70fddc","Type":"ContainerDied","Data":"c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3"} Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.029663 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj66p" event={"ID":"7a179da2-a3c8-4c5f-aedf-be464d70fddc","Type":"ContainerDied","Data":"64fb66bd99dd4e39483bc82c83e234623ab036a6ad51909f68ab728430c3a7bc"} Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.029689 4753 scope.go:117] "RemoveContainer" containerID="c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.050881 4753 scope.go:117] "RemoveContainer" containerID="292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.058079 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lj66p"] Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.063074 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lj66p"] Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.080305 4753 scope.go:117] "RemoveContainer" containerID="a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.096431 4753 scope.go:117] "RemoveContainer" containerID="c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3" Jan 29 15:11:16 crc kubenswrapper[4753]: E0129 15:11:16.096960 4753 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3\": container with ID starting with c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3 not found: ID does not exist" containerID="c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.096991 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3"} err="failed to get container status \"c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3\": rpc error: code = NotFound desc = could not find container \"c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3\": container with ID starting with c16d998819f7b0c70e795cf9a14248e9d8d39f5f0dba2cff66716ff99d1d3ab3 not found: ID does not exist" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.097013 4753 scope.go:117] "RemoveContainer" containerID="292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735" Jan 29 15:11:16 crc kubenswrapper[4753]: E0129 15:11:16.097444 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735\": container with ID starting with 292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735 not found: ID does not exist" containerID="292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.097467 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735"} err="failed to get container status \"292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735\": rpc error: code = NotFound desc = could not find container \"292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735\": container with ID starting with 292332c2645918acbd254c73c569342aead651f9758b88c0f7bba5958b864735 not found: ID does not exist" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.097487 4753 scope.go:117] "RemoveContainer" containerID="a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4" Jan 29 15:11:16 crc kubenswrapper[4753]: E0129 15:11:16.097852 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4\": container with ID starting with a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4 not found: ID does not exist" containerID="a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.097875 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4"} err="failed to get container status \"a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4\": rpc error: code = NotFound desc = could not find container \"a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4\": container with ID starting with a9d31721eae6b62df02a9c4700c16a6299214ac21ec0fa1f31c3f91e7e66fed4 not found: ID does not exist" Jan 29 15:11:16 crc kubenswrapper[4753]: I0129 15:11:16.159760 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" path="/var/lib/kubelet/pods/7a179da2-a3c8-4c5f-aedf-be464d70fddc/volumes" Jan 29 15:11:27 crc kubenswrapper[4753]: I0129 15:11:27.054893 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:11:27 crc kubenswrapper[4753]: I0129 15:11:27.055569 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:11:27 crc kubenswrapper[4753]: E0129 15:11:27.150877 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:11:38 crc kubenswrapper[4753]: E0129 15:11:38.152004 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:11:53 crc kubenswrapper[4753]: E0129 15:11:53.152330 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" Jan 29 15:11:57 crc kubenswrapper[4753]: I0129 15:11:57.054578 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:11:57 crc kubenswrapper[4753]: I0129 15:11:57.054899 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:11:57 crc kubenswrapper[4753]: I0129 15:11:57.054962 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 15:11:57 crc kubenswrapper[4753]: I0129 15:11:57.055703 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3dcb3a962df2116db3467e2a4acd6e1502acdc60563dc696461415f1f8a1f591"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 15:11:57 crc kubenswrapper[4753]: I0129 15:11:57.055773 4753 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://3dcb3a962df2116db3467e2a4acd6e1502acdc60563dc696461415f1f8a1f591" gracePeriod=600 Jan 29 15:11:58 crc kubenswrapper[4753]: I0129 15:11:58.341264 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="3dcb3a962df2116db3467e2a4acd6e1502acdc60563dc696461415f1f8a1f591" exitCode=0 Jan 29 15:11:58 crc kubenswrapper[4753]: I0129 15:11:58.341574 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"3dcb3a962df2116db3467e2a4acd6e1502acdc60563dc696461415f1f8a1f591"} Jan 29 15:11:58 crc kubenswrapper[4753]: I0129 15:11:58.341715 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"} Jan 29 15:11:58 crc kubenswrapper[4753]: I0129 15:11:58.341736 4753 scope.go:117] "RemoveContainer" containerID="57c73db73634b86faa8e72eff06a358c972236f3bb860a5c5a35e836033dd44e" Jan 29 15:12:07 crc kubenswrapper[4753]: I0129 15:12:07.417079 4753 generic.go:334] "Generic (PLEG): container finished" podID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerID="cea4758166f50094857752636076cdc3972d6ae079d13b2a50d25508bbe0373a" exitCode=0 Jan 29 15:12:07 crc kubenswrapper[4753]: I0129 15:12:07.417209 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hdjw" event={"ID":"beeada0f-43d9-4a2b-8faf-08a90f6223a6","Type":"ContainerDied","Data":"cea4758166f50094857752636076cdc3972d6ae079d13b2a50d25508bbe0373a"} Jan 29 15:12:09 crc kubenswrapper[4753]: I0129 15:12:09.450251 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hdjw" event={"ID":"beeada0f-43d9-4a2b-8faf-08a90f6223a6","Type":"ContainerStarted","Data":"5b35a1c70b4d5b47339382266646196be494f1f594dd551233dc34e82a2c6d41"} Jan 29 15:12:09 crc kubenswrapper[4753]: I0129 15:12:09.469581 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7hdjw" podStartSLOduration=3.093498155 podStartE2EDuration="3m4.469564837s" podCreationTimestamp="2026-01-29 15:09:05 +0000 UTC" firstStartedPulling="2026-01-29 15:09:07.063955355 +0000 UTC m=+3981.758689737" lastFinishedPulling="2026-01-29 15:12:08.440021997 +0000 UTC m=+4163.134756419" observedRunningTime="2026-01-29 15:12:09.466276816 +0000 UTC m=+4164.161011218" watchObservedRunningTime="2026-01-29 15:12:09.469564837 +0000 UTC m=+4164.164299219" Jan 29 15:12:16 crc kubenswrapper[4753]: I0129 15:12:16.202474 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:12:16 crc kubenswrapper[4753]: I0129 15:12:16.205215 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:12:16 crc kubenswrapper[4753]: I0129 15:12:16.249548 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:12:16 crc kubenswrapper[4753]: I0129 
15:12:16.547414 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:12:16 crc kubenswrapper[4753]: I0129 15:12:16.599621 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7hdjw"] Jan 29 15:12:18 crc kubenswrapper[4753]: I0129 15:12:18.517173 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7hdjw" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerName="registry-server" containerID="cri-o://5b35a1c70b4d5b47339382266646196be494f1f594dd551233dc34e82a2c6d41" gracePeriod=2 Jan 29 15:12:21 crc kubenswrapper[4753]: I0129 15:12:21.543987 4753 generic.go:334] "Generic (PLEG): container finished" podID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerID="5b35a1c70b4d5b47339382266646196be494f1f594dd551233dc34e82a2c6d41" exitCode=0 Jan 29 15:12:21 crc kubenswrapper[4753]: I0129 15:12:21.544069 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hdjw" event={"ID":"beeada0f-43d9-4a2b-8faf-08a90f6223a6","Type":"ContainerDied","Data":"5b35a1c70b4d5b47339382266646196be494f1f594dd551233dc34e82a2c6d41"} Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.036541 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.208568 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg4lz\" (UniqueName: \"kubernetes.io/projected/beeada0f-43d9-4a2b-8faf-08a90f6223a6-kube-api-access-cg4lz\") pod \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.209243 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-catalog-content\") pod \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.210952 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-utilities\") pod \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\" (UID: \"beeada0f-43d9-4a2b-8faf-08a90f6223a6\") " Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.212816 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-utilities" (OuterVolumeSpecName: "utilities") pod "beeada0f-43d9-4a2b-8faf-08a90f6223a6" (UID: "beeada0f-43d9-4a2b-8faf-08a90f6223a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.214588 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beeada0f-43d9-4a2b-8faf-08a90f6223a6-kube-api-access-cg4lz" (OuterVolumeSpecName: "kube-api-access-cg4lz") pod "beeada0f-43d9-4a2b-8faf-08a90f6223a6" (UID: "beeada0f-43d9-4a2b-8faf-08a90f6223a6"). InnerVolumeSpecName "kube-api-access-cg4lz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.272113 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "beeada0f-43d9-4a2b-8faf-08a90f6223a6" (UID: "beeada0f-43d9-4a2b-8faf-08a90f6223a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.313622 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.313670 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg4lz\" (UniqueName: \"kubernetes.io/projected/beeada0f-43d9-4a2b-8faf-08a90f6223a6-kube-api-access-cg4lz\") on node \"crc\" DevicePath \"\"" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.313680 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beeada0f-43d9-4a2b-8faf-08a90f6223a6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.556022 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7hdjw" event={"ID":"beeada0f-43d9-4a2b-8faf-08a90f6223a6","Type":"ContainerDied","Data":"531cad93af8a3e60bcd67a01716b872abcc8e5da92c8bcb60a0930404245f820"} Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.556078 4753 scope.go:117] "RemoveContainer" containerID="5b35a1c70b4d5b47339382266646196be494f1f594dd551233dc34e82a2c6d41" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.556184 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7hdjw" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.582794 4753 scope.go:117] "RemoveContainer" containerID="cea4758166f50094857752636076cdc3972d6ae079d13b2a50d25508bbe0373a" Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.604695 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7hdjw"] Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.610675 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7hdjw"] Jan 29 15:12:22 crc kubenswrapper[4753]: I0129 15:12:22.631375 4753 scope.go:117] "RemoveContainer" containerID="5bb67c75953cd913c691f9316db2e8e1d91944bc68610a76f05211bed010adbf" Jan 29 15:12:24 crc kubenswrapper[4753]: I0129 15:12:24.164284 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" path="/var/lib/kubelet/pods/beeada0f-43d9-4a2b-8faf-08a90f6223a6/volumes" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.779413 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-57sk9"] Jan 29 15:12:45 crc kubenswrapper[4753]: E0129 15:12:45.780444 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerName="registry-server" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.780466 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerName="registry-server" Jan 29 15:12:45 crc kubenswrapper[4753]: E0129 15:12:45.780499 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerName="extract-utilities" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.780511 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerName="extract-utilities" Jan 29 15:12:45 crc kubenswrapper[4753]: E0129 15:12:45.780523 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerName="extract-utilities" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.780533 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerName="extract-utilities" Jan 29 15:12:45 crc kubenswrapper[4753]: E0129 15:12:45.780550 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerName="extract-content" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.780559 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerName="extract-content" Jan 29 15:12:45 crc kubenswrapper[4753]: E0129 15:12:45.780580 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerName="registry-server" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.780590 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerName="registry-server" Jan 29 15:12:45 crc kubenswrapper[4753]: E0129 15:12:45.780614 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerName="extract-content" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.780624 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerName="extract-content" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.780856 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="beeada0f-43d9-4a2b-8faf-08a90f6223a6" containerName="registry-server" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.780885 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a179da2-a3c8-4c5f-aedf-be464d70fddc" containerName="registry-server" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.782358 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.784512 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57sk9"] Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.869400 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-catalog-content\") pod \"community-operators-57sk9\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.869450 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-utilities\") pod \"community-operators-57sk9\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.869563 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8n7\" (UniqueName: \"kubernetes.io/projected/b474874e-57a0-484f-ba9a-6bb29d04a93b-kube-api-access-kd8n7\") pod \"community-operators-57sk9\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.971303 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-catalog-content\") pod \"community-operators-57sk9\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.971343 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-utilities\") pod \"community-operators-57sk9\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.971419 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd8n7\" (UniqueName: \"kubernetes.io/projected/b474874e-57a0-484f-ba9a-6bb29d04a93b-kube-api-access-kd8n7\") pod \"community-operators-57sk9\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.971876 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-utilities\") pod 
\"community-operators-57sk9\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.972184 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-catalog-content\") pod \"community-operators-57sk9\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:45 crc kubenswrapper[4753]: I0129 15:12:45.990583 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd8n7\" (UniqueName: \"kubernetes.io/projected/b474874e-57a0-484f-ba9a-6bb29d04a93b-kube-api-access-kd8n7\") pod \"community-operators-57sk9\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:46 crc kubenswrapper[4753]: I0129 15:12:46.103653 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:46 crc kubenswrapper[4753]: I0129 15:12:46.430999 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57sk9"] Jan 29 15:12:46 crc kubenswrapper[4753]: I0129 15:12:46.777049 4753 generic.go:334] "Generic (PLEG): container finished" podID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerID="1ce3fa95a7135c1e7bc2d58ef8816e7d06309542d1b4995d3bb3d9f56cf93d61" exitCode=0 Jan 29 15:12:46 crc kubenswrapper[4753]: I0129 15:12:46.777139 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57sk9" event={"ID":"b474874e-57a0-484f-ba9a-6bb29d04a93b","Type":"ContainerDied","Data":"1ce3fa95a7135c1e7bc2d58ef8816e7d06309542d1b4995d3bb3d9f56cf93d61"} Jan 29 15:12:46 crc kubenswrapper[4753]: I0129 15:12:46.777206 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57sk9" event={"ID":"b474874e-57a0-484f-ba9a-6bb29d04a93b","Type":"ContainerStarted","Data":"e850c23b1e41e2dd69238b8703ef32a47cef9ff68bd10b3adfc919d50ff83f95"} Jan 29 15:12:48 crc kubenswrapper[4753]: I0129 15:12:48.797055 4753 generic.go:334] "Generic (PLEG): container finished" podID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerID="9e2f8c81688d8731f172586ca72f147c3277c4dc163f63bb490c8cc24f944f3a" exitCode=0 Jan 29 15:12:48 crc kubenswrapper[4753]: I0129 15:12:48.797141 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57sk9" event={"ID":"b474874e-57a0-484f-ba9a-6bb29d04a93b","Type":"ContainerDied","Data":"9e2f8c81688d8731f172586ca72f147c3277c4dc163f63bb490c8cc24f944f3a"} Jan 29 15:12:49 crc kubenswrapper[4753]: I0129 15:12:49.806408 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57sk9" event={"ID":"b474874e-57a0-484f-ba9a-6bb29d04a93b","Type":"ContainerStarted","Data":"88ffc25885b568a48bc3dbe7b8f6974322fd2a5ec22a144c751ce3a4cd724f67"} Jan 29 15:12:49 crc kubenswrapper[4753]: I0129 15:12:49.830308 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-57sk9" podStartSLOduration=2.180303449 podStartE2EDuration="4.830289374s" podCreationTimestamp="2026-01-29 15:12:45 +0000 UTC" firstStartedPulling="2026-01-29 15:12:46.778541761 +0000 UTC m=+4201.473276143" lastFinishedPulling="2026-01-29 15:12:49.428527676 +0000 UTC 
m=+4204.123262068" observedRunningTime="2026-01-29 15:12:49.822804772 +0000 UTC m=+4204.517539174" watchObservedRunningTime="2026-01-29 15:12:49.830289374 +0000 UTC m=+4204.525023766" Jan 29 15:12:56 crc kubenswrapper[4753]: I0129 15:12:56.103731 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:56 crc kubenswrapper[4753]: I0129 15:12:56.104073 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:56 crc kubenswrapper[4753]: I0129 15:12:56.159087 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:56 crc kubenswrapper[4753]: I0129 15:12:56.907862 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:12:56 crc kubenswrapper[4753]: I0129 15:12:56.957064 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-57sk9"] Jan 29 15:12:58 crc kubenswrapper[4753]: I0129 15:12:58.873971 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-57sk9" podUID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerName="registry-server" containerID="cri-o://88ffc25885b568a48bc3dbe7b8f6974322fd2a5ec22a144c751ce3a4cd724f67" gracePeriod=2 Jan 29 15:13:00 crc kubenswrapper[4753]: I0129 15:13:00.894938 4753 generic.go:334] "Generic (PLEG): container finished" podID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerID="88ffc25885b568a48bc3dbe7b8f6974322fd2a5ec22a144c751ce3a4cd724f67" exitCode=0 Jan 29 15:13:00 crc kubenswrapper[4753]: I0129 15:13:00.895093 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57sk9" event={"ID":"b474874e-57a0-484f-ba9a-6bb29d04a93b","Type":"ContainerDied","Data":"88ffc25885b568a48bc3dbe7b8f6974322fd2a5ec22a144c751ce3a4cd724f67"} Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.690456 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.799057 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd8n7\" (UniqueName: \"kubernetes.io/projected/b474874e-57a0-484f-ba9a-6bb29d04a93b-kube-api-access-kd8n7\") pod \"b474874e-57a0-484f-ba9a-6bb29d04a93b\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.799269 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-catalog-content\") pod \"b474874e-57a0-484f-ba9a-6bb29d04a93b\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.799316 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-utilities\") pod \"b474874e-57a0-484f-ba9a-6bb29d04a93b\" (UID: \"b474874e-57a0-484f-ba9a-6bb29d04a93b\") " Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.800318 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-utilities" (OuterVolumeSpecName: "utilities") pod "b474874e-57a0-484f-ba9a-6bb29d04a93b" (UID: "b474874e-57a0-484f-ba9a-6bb29d04a93b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.810528 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b474874e-57a0-484f-ba9a-6bb29d04a93b-kube-api-access-kd8n7" (OuterVolumeSpecName: "kube-api-access-kd8n7") pod "b474874e-57a0-484f-ba9a-6bb29d04a93b" (UID: "b474874e-57a0-484f-ba9a-6bb29d04a93b"). InnerVolumeSpecName "kube-api-access-kd8n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.850667 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b474874e-57a0-484f-ba9a-6bb29d04a93b" (UID: "b474874e-57a0-484f-ba9a-6bb29d04a93b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.900879 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.900913 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b474874e-57a0-484f-ba9a-6bb29d04a93b-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.900924 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd8n7\" (UniqueName: \"kubernetes.io/projected/b474874e-57a0-484f-ba9a-6bb29d04a93b-kube-api-access-kd8n7\") on node \"crc\" DevicePath \"\"" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.905040 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57sk9" event={"ID":"b474874e-57a0-484f-ba9a-6bb29d04a93b","Type":"ContainerDied","Data":"e850c23b1e41e2dd69238b8703ef32a47cef9ff68bd10b3adfc919d50ff83f95"} Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.905090 4753 scope.go:117] "RemoveContainer" containerID="88ffc25885b568a48bc3dbe7b8f6974322fd2a5ec22a144c751ce3a4cd724f67" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.905241 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57sk9" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.933959 4753 scope.go:117] "RemoveContainer" containerID="9e2f8c81688d8731f172586ca72f147c3277c4dc163f63bb490c8cc24f944f3a" Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.946037 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-57sk9"] Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.952665 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-57sk9"] Jan 29 15:13:01 crc kubenswrapper[4753]: I0129 15:13:01.959807 4753 scope.go:117] "RemoveContainer" containerID="1ce3fa95a7135c1e7bc2d58ef8816e7d06309542d1b4995d3bb3d9f56cf93d61" Jan 29 15:13:02 crc kubenswrapper[4753]: I0129 15:13:02.158185 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b474874e-57a0-484f-ba9a-6bb29d04a93b" path="/var/lib/kubelet/pods/b474874e-57a0-484f-ba9a-6bb29d04a93b/volumes" Jan 29 15:14:27 crc kubenswrapper[4753]: I0129 15:14:27.054695 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:14:27 crc kubenswrapper[4753]: I0129 15:14:27.055275 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:14:57 crc kubenswrapper[4753]: I0129 15:14:57.054651 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
Jan 29 15:14:57 crc kubenswrapper[4753]: I0129 15:14:57.054651 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 15:14:57 crc kubenswrapper[4753]: I0129 15:14:57.055923 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.181764 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l"]
Jan 29 15:15:00 crc kubenswrapper[4753]: E0129 15:15:00.182451 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerName="extract-utilities"
Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.182469 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerName="extract-utilities"
Jan 29 15:15:00 crc kubenswrapper[4753]: E0129 15:15:00.182492 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerName="registry-server"
Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.182502 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerName="registry-server"
Jan 29 15:15:00 crc kubenswrapper[4753]: E0129 15:15:00.182522 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerName="extract-content"
Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.182532 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerName="extract-content"
Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.182682 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b474874e-57a0-484f-ba9a-6bb29d04a93b" containerName="registry-server"
Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.183257 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l"
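The numeric suffix in collect-profiles-29494995 is the CronJob controller's scheduled-time hash: minutes since the Unix epoch. Decoding it recovers the slot that produced this Job, which matches the 15:15:00 timestamps above; the older 29494950 run (deleted a few entries below) decodes to 14:30:00, consistent with a 15-minute schedule retaining the last three completed jobs:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Job name suffix = scheduledTime.Unix() / 60 in the CronJob controller.
	fmt.Println(time.Unix(29494995*60, 0).UTC()) // 2026-01-29 15:15:00 +0000 UTC
	fmt.Println(time.Unix(29494950*60, 0).UTC()) // 2026-01-29 14:30:00 +0000 UTC
}
```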
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.187399 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.187554 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.194825 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l"] Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.318906 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c5586a2-57df-4d76-ba74-6ca3b94f38af-secret-volume\") pod \"collect-profiles-29494995-8qk2l\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.318967 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbd2f\" (UniqueName: \"kubernetes.io/projected/2c5586a2-57df-4d76-ba74-6ca3b94f38af-kube-api-access-tbd2f\") pod \"collect-profiles-29494995-8qk2l\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.319004 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c5586a2-57df-4d76-ba74-6ca3b94f38af-config-volume\") pod \"collect-profiles-29494995-8qk2l\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.420465 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c5586a2-57df-4d76-ba74-6ca3b94f38af-secret-volume\") pod \"collect-profiles-29494995-8qk2l\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.420536 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbd2f\" (UniqueName: \"kubernetes.io/projected/2c5586a2-57df-4d76-ba74-6ca3b94f38af-kube-api-access-tbd2f\") pod \"collect-profiles-29494995-8qk2l\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.420573 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c5586a2-57df-4d76-ba74-6ca3b94f38af-config-volume\") pod \"collect-profiles-29494995-8qk2l\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.421494 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c5586a2-57df-4d76-ba74-6ca3b94f38af-config-volume\") pod 
\"collect-profiles-29494995-8qk2l\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.427477 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c5586a2-57df-4d76-ba74-6ca3b94f38af-secret-volume\") pod \"collect-profiles-29494995-8qk2l\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.437106 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbd2f\" (UniqueName: \"kubernetes.io/projected/2c5586a2-57df-4d76-ba74-6ca3b94f38af-kube-api-access-tbd2f\") pod \"collect-profiles-29494995-8qk2l\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.509096 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:00 crc kubenswrapper[4753]: I0129 15:15:00.943545 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l"] Jan 29 15:15:01 crc kubenswrapper[4753]: I0129 15:15:01.756735 4753 generic.go:334] "Generic (PLEG): container finished" podID="2c5586a2-57df-4d76-ba74-6ca3b94f38af" containerID="f1440e5380ae4009ed02f3227bfb94292d746bb52469074d42f9623e8355a950" exitCode=0 Jan 29 15:15:01 crc kubenswrapper[4753]: I0129 15:15:01.756808 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" event={"ID":"2c5586a2-57df-4d76-ba74-6ca3b94f38af","Type":"ContainerDied","Data":"f1440e5380ae4009ed02f3227bfb94292d746bb52469074d42f9623e8355a950"} Jan 29 15:15:01 crc kubenswrapper[4753]: I0129 15:15:01.757104 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" event={"ID":"2c5586a2-57df-4d76-ba74-6ca3b94f38af","Type":"ContainerStarted","Data":"3daac6b227baac1a419d3fa90e14145c5b03064b9ec29500368d2bf1673e8256"} Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.010576 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.162355 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbd2f\" (UniqueName: \"kubernetes.io/projected/2c5586a2-57df-4d76-ba74-6ca3b94f38af-kube-api-access-tbd2f\") pod \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.162425 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c5586a2-57df-4d76-ba74-6ca3b94f38af-secret-volume\") pod \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.162489 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c5586a2-57df-4d76-ba74-6ca3b94f38af-config-volume\") pod \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\" (UID: \"2c5586a2-57df-4d76-ba74-6ca3b94f38af\") " Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.163296 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c5586a2-57df-4d76-ba74-6ca3b94f38af-config-volume" (OuterVolumeSpecName: "config-volume") pod "2c5586a2-57df-4d76-ba74-6ca3b94f38af" (UID: "2c5586a2-57df-4d76-ba74-6ca3b94f38af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.167381 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c5586a2-57df-4d76-ba74-6ca3b94f38af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2c5586a2-57df-4d76-ba74-6ca3b94f38af" (UID: "2c5586a2-57df-4d76-ba74-6ca3b94f38af"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.167543 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c5586a2-57df-4d76-ba74-6ca3b94f38af-kube-api-access-tbd2f" (OuterVolumeSpecName: "kube-api-access-tbd2f") pod "2c5586a2-57df-4d76-ba74-6ca3b94f38af" (UID: "2c5586a2-57df-4d76-ba74-6ca3b94f38af"). InnerVolumeSpecName "kube-api-access-tbd2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.265340 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c5586a2-57df-4d76-ba74-6ca3b94f38af-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.265369 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c5586a2-57df-4d76-ba74-6ca3b94f38af-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.265382 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbd2f\" (UniqueName: \"kubernetes.io/projected/2c5586a2-57df-4d76-ba74-6ca3b94f38af-kube-api-access-tbd2f\") on node \"crc\" DevicePath \"\"" Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.770410 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" event={"ID":"2c5586a2-57df-4d76-ba74-6ca3b94f38af","Type":"ContainerDied","Data":"3daac6b227baac1a419d3fa90e14145c5b03064b9ec29500368d2bf1673e8256"} Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.770448 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29494995-8qk2l" Jan 29 15:15:03 crc kubenswrapper[4753]: I0129 15:15:03.770458 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3daac6b227baac1a419d3fa90e14145c5b03064b9ec29500368d2bf1673e8256" Jan 29 15:15:04 crc kubenswrapper[4753]: I0129 15:15:04.123004 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22"] Jan 29 15:15:04 crc kubenswrapper[4753]: I0129 15:15:04.129401 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494950-62n22"] Jan 29 15:15:04 crc kubenswrapper[4753]: I0129 15:15:04.159718 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f636be-69c3-4afd-a2f0-fbf0e47c0e83" path="/var/lib/kubelet/pods/41f636be-69c3-4afd-a2f0-fbf0e47c0e83/volumes" Jan 29 15:15:06 crc kubenswrapper[4753]: I0129 15:15:06.025440 4753 scope.go:117] "RemoveContainer" containerID="c173db9ae7bc291d121b51324e26a1f9182e273f61a8730452f19be66d914eb4" Jan 29 15:15:27 crc kubenswrapper[4753]: I0129 15:15:27.055119 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:15:27 crc kubenswrapper[4753]: I0129 15:15:27.055713 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:15:27 crc kubenswrapper[4753]: I0129 15:15:27.055775 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 15:15:27 crc kubenswrapper[4753]: I0129 15:15:27.056358 4753 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 15:15:27 crc kubenswrapper[4753]: I0129 15:15:27.056411 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" gracePeriod=600 Jan 29 15:15:27 crc kubenswrapper[4753]: E0129 15:15:27.887560 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:15:27 crc kubenswrapper[4753]: I0129 15:15:27.945120 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" exitCode=0 Jan 29 15:15:27 crc kubenswrapper[4753]: I0129 15:15:27.945193 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"} Jan 29 15:15:27 crc kubenswrapper[4753]: I0129 15:15:27.945262 4753 scope.go:117] "RemoveContainer" containerID="3dcb3a962df2116db3467e2a4acd6e1502acdc60563dc696461415f1f8a1f591" Jan 29 15:15:27 crc kubenswrapper[4753]: I0129 15:15:27.945764 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:15:27 crc kubenswrapper[4753]: E0129 15:15:27.946073 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:15:42 crc kubenswrapper[4753]: I0129 15:15:42.150435 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:15:42 crc kubenswrapper[4753]: E0129 15:15:42.151246 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:15:53 crc kubenswrapper[4753]: I0129 15:15:53.149778 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:15:53 crc kubenswrapper[4753]: E0129 15:15:53.150505 4753 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:16:06 crc kubenswrapper[4753]: I0129 15:16:06.154323 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:16:06 crc kubenswrapper[4753]: E0129 15:16:06.155326 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:16:17 crc kubenswrapper[4753]: I0129 15:16:17.149400 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:16:17 crc kubenswrapper[4753]: E0129 15:16:17.150005 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:16:32 crc kubenswrapper[4753]: I0129 15:16:32.148996 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:16:32 crc kubenswrapper[4753]: E0129 15:16:32.149805 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:16:46 crc kubenswrapper[4753]: I0129 15:16:46.153252 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:16:46 crc kubenswrapper[4753]: E0129 15:16:46.154470 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.257677 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-9wffl"] Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.263181 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-9wffl"] Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.367059 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["crc-storage/crc-storage-crc-zgl9q"] Jan 29 15:16:49 crc kubenswrapper[4753]: E0129 15:16:49.367368 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c5586a2-57df-4d76-ba74-6ca3b94f38af" containerName="collect-profiles" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.367390 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c5586a2-57df-4d76-ba74-6ca3b94f38af" containerName="collect-profiles" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.367530 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c5586a2-57df-4d76-ba74-6ca3b94f38af" containerName="collect-profiles" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.367967 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.370636 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.370836 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.371143 4753 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-x2h76" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.375348 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.382452 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zgl9q"] Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.526819 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a605828-cc86-43a0-8a38-2a224cd946bc-crc-storage\") pod \"crc-storage-crc-zgl9q\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.526911 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a605828-cc86-43a0-8a38-2a224cd946bc-node-mnt\") pod \"crc-storage-crc-zgl9q\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.526940 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkvqm\" (UniqueName: \"kubernetes.io/projected/1a605828-cc86-43a0-8a38-2a224cd946bc-kube-api-access-hkvqm\") pod \"crc-storage-crc-zgl9q\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.628609 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a605828-cc86-43a0-8a38-2a224cd946bc-crc-storage\") pod \"crc-storage-crc-zgl9q\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.628735 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a605828-cc86-43a0-8a38-2a224cd946bc-node-mnt\") pod \"crc-storage-crc-zgl9q\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " 
pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.628768 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkvqm\" (UniqueName: \"kubernetes.io/projected/1a605828-cc86-43a0-8a38-2a224cd946bc-kube-api-access-hkvqm\") pod \"crc-storage-crc-zgl9q\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.629093 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a605828-cc86-43a0-8a38-2a224cd946bc-node-mnt\") pod \"crc-storage-crc-zgl9q\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.629648 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a605828-cc86-43a0-8a38-2a224cd946bc-crc-storage\") pod \"crc-storage-crc-zgl9q\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.647350 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkvqm\" (UniqueName: \"kubernetes.io/projected/1a605828-cc86-43a0-8a38-2a224cd946bc-kube-api-access-hkvqm\") pod \"crc-storage-crc-zgl9q\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:49 crc kubenswrapper[4753]: I0129 15:16:49.683969 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:50 crc kubenswrapper[4753]: I0129 15:16:50.084631 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zgl9q"] Jan 29 15:16:50 crc kubenswrapper[4753]: I0129 15:16:50.102186 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 15:16:50 crc kubenswrapper[4753]: I0129 15:16:50.158477 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28beaa45-8794-4da4-9715-05084646e567" path="/var/lib/kubelet/pods/28beaa45-8794-4da4-9715-05084646e567/volumes" Jan 29 15:16:50 crc kubenswrapper[4753]: I0129 15:16:50.588325 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zgl9q" event={"ID":"1a605828-cc86-43a0-8a38-2a224cd946bc","Type":"ContainerStarted","Data":"fdb046b847c49aa1393a52d581ee6b993f538bd47e2117b0a19de325cd2e7b6d"} Jan 29 15:16:51 crc kubenswrapper[4753]: I0129 15:16:51.601750 4753 generic.go:334] "Generic (PLEG): container finished" podID="1a605828-cc86-43a0-8a38-2a224cd946bc" containerID="0505363f7b613cbedabfefdb2d26ee3018c6869379b11973da0d4ce0196c990f" exitCode=0 Jan 29 15:16:51 crc kubenswrapper[4753]: I0129 15:16:51.602299 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zgl9q" event={"ID":"1a605828-cc86-43a0-8a38-2a224cd946bc","Type":"ContainerDied","Data":"0505363f7b613cbedabfefdb2d26ee3018c6869379b11973da0d4ce0196c990f"} Jan 29 15:16:52 crc kubenswrapper[4753]: I0129 15:16:52.923364 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.109325 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkvqm\" (UniqueName: \"kubernetes.io/projected/1a605828-cc86-43a0-8a38-2a224cd946bc-kube-api-access-hkvqm\") pod \"1a605828-cc86-43a0-8a38-2a224cd946bc\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.109409 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a605828-cc86-43a0-8a38-2a224cd946bc-crc-storage\") pod \"1a605828-cc86-43a0-8a38-2a224cd946bc\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.109438 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a605828-cc86-43a0-8a38-2a224cd946bc-node-mnt\") pod \"1a605828-cc86-43a0-8a38-2a224cd946bc\" (UID: \"1a605828-cc86-43a0-8a38-2a224cd946bc\") " Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.109621 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a605828-cc86-43a0-8a38-2a224cd946bc-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1a605828-cc86-43a0-8a38-2a224cd946bc" (UID: "1a605828-cc86-43a0-8a38-2a224cd946bc"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.109828 4753 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a605828-cc86-43a0-8a38-2a224cd946bc-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.116570 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a605828-cc86-43a0-8a38-2a224cd946bc-kube-api-access-hkvqm" (OuterVolumeSpecName: "kube-api-access-hkvqm") pod "1a605828-cc86-43a0-8a38-2a224cd946bc" (UID: "1a605828-cc86-43a0-8a38-2a224cd946bc"). InnerVolumeSpecName "kube-api-access-hkvqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.133931 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a605828-cc86-43a0-8a38-2a224cd946bc-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1a605828-cc86-43a0-8a38-2a224cd946bc" (UID: "1a605828-cc86-43a0-8a38-2a224cd946bc"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.211410 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkvqm\" (UniqueName: \"kubernetes.io/projected/1a605828-cc86-43a0-8a38-2a224cd946bc-kube-api-access-hkvqm\") on node \"crc\" DevicePath \"\"" Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.211702 4753 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a605828-cc86-43a0-8a38-2a224cd946bc-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.631370 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zgl9q" event={"ID":"1a605828-cc86-43a0-8a38-2a224cd946bc","Type":"ContainerDied","Data":"fdb046b847c49aa1393a52d581ee6b993f538bd47e2117b0a19de325cd2e7b6d"} Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.631412 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdb046b847c49aa1393a52d581ee6b993f538bd47e2117b0a19de325cd2e7b6d" Jan 29 15:16:53 crc kubenswrapper[4753]: I0129 15:16:53.631445 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zgl9q" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.110348 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-zgl9q"] Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.118118 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-zgl9q"] Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.237651 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tlr2n"] Jan 29 15:16:55 crc kubenswrapper[4753]: E0129 15:16:55.238475 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a605828-cc86-43a0-8a38-2a224cd946bc" containerName="storage" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.238510 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a605828-cc86-43a0-8a38-2a224cd946bc" containerName="storage" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.238960 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a605828-cc86-43a0-8a38-2a224cd946bc" containerName="storage" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.240568 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.242939 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.243294 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.243442 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.243790 4753 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-x2h76" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.253819 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tlr2n"] Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.343178 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-crc-storage\") pod \"crc-storage-crc-tlr2n\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.343267 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2b4p\" (UniqueName: \"kubernetes.io/projected/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-kube-api-access-n2b4p\") pod \"crc-storage-crc-tlr2n\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.343301 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-node-mnt\") pod \"crc-storage-crc-tlr2n\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.444376 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-crc-storage\") pod \"crc-storage-crc-tlr2n\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.444703 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2b4p\" (UniqueName: \"kubernetes.io/projected/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-kube-api-access-n2b4p\") pod \"crc-storage-crc-tlr2n\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.444730 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-node-mnt\") pod \"crc-storage-crc-tlr2n\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.445007 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-node-mnt\") pod \"crc-storage-crc-tlr2n\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " 
pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.445100 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-crc-storage\") pod \"crc-storage-crc-tlr2n\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.465689 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2b4p\" (UniqueName: \"kubernetes.io/projected/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-kube-api-access-n2b4p\") pod \"crc-storage-crc-tlr2n\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:55 crc kubenswrapper[4753]: I0129 15:16:55.575505 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:56 crc kubenswrapper[4753]: I0129 15:16:56.024650 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tlr2n"] Jan 29 15:16:56 crc kubenswrapper[4753]: W0129 15:16:56.030068 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7de80a9d_9ae6_4d73_abb5_a0d604c5ee6d.slice/crio-5303bc6d4f6b4da8a20cd020f4a43d780a057004d0e092bee7ce0009d90b03bf WatchSource:0}: Error finding container 5303bc6d4f6b4da8a20cd020f4a43d780a057004d0e092bee7ce0009d90b03bf: Status 404 returned error can't find the container with id 5303bc6d4f6b4da8a20cd020f4a43d780a057004d0e092bee7ce0009d90b03bf Jan 29 15:16:56 crc kubenswrapper[4753]: I0129 15:16:56.164220 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a605828-cc86-43a0-8a38-2a224cd946bc" path="/var/lib/kubelet/pods/1a605828-cc86-43a0-8a38-2a224cd946bc/volumes" Jan 29 15:16:56 crc kubenswrapper[4753]: I0129 15:16:56.655792 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tlr2n" event={"ID":"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d","Type":"ContainerStarted","Data":"5303bc6d4f6b4da8a20cd020f4a43d780a057004d0e092bee7ce0009d90b03bf"} Jan 29 15:16:57 crc kubenswrapper[4753]: I0129 15:16:57.665418 4753 generic.go:334] "Generic (PLEG): container finished" podID="7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" containerID="07b1f1e004c727d705361a41c38705eeb6852dae4e9513894ca4a60131c980a4" exitCode=0 Jan 29 15:16:57 crc kubenswrapper[4753]: I0129 15:16:57.665555 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tlr2n" event={"ID":"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d","Type":"ContainerDied","Data":"07b1f1e004c727d705361a41c38705eeb6852dae4e9513894ca4a60131c980a4"} Jan 29 15:16:58 crc kubenswrapper[4753]: I0129 15:16:58.935481 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:16:58 crc kubenswrapper[4753]: I0129 15:16:58.994571 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-crc-storage\") pod \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " Jan 29 15:16:58 crc kubenswrapper[4753]: I0129 15:16:58.994640 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2b4p\" (UniqueName: \"kubernetes.io/projected/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-kube-api-access-n2b4p\") pod \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " Jan 29 15:16:58 crc kubenswrapper[4753]: I0129 15:16:58.994732 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-node-mnt\") pod \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\" (UID: \"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d\") " Jan 29 15:16:58 crc kubenswrapper[4753]: I0129 15:16:58.995013 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" (UID: "7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:16:58 crc kubenswrapper[4753]: I0129 15:16:58.999479 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-kube-api-access-n2b4p" (OuterVolumeSpecName: "kube-api-access-n2b4p") pod "7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" (UID: "7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d"). InnerVolumeSpecName "kube-api-access-n2b4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:16:59 crc kubenswrapper[4753]: I0129 15:16:59.012274 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" (UID: "7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:16:59 crc kubenswrapper[4753]: I0129 15:16:59.095865 4753 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 29 15:16:59 crc kubenswrapper[4753]: I0129 15:16:59.095902 4753 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 29 15:16:59 crc kubenswrapper[4753]: I0129 15:16:59.095915 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2b4p\" (UniqueName: \"kubernetes.io/projected/7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d-kube-api-access-n2b4p\") on node \"crc\" DevicePath \"\"" Jan 29 15:16:59 crc kubenswrapper[4753]: I0129 15:16:59.689966 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tlr2n" event={"ID":"7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d","Type":"ContainerDied","Data":"5303bc6d4f6b4da8a20cd020f4a43d780a057004d0e092bee7ce0009d90b03bf"} Jan 29 15:16:59 crc kubenswrapper[4753]: I0129 15:16:59.690010 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5303bc6d4f6b4da8a20cd020f4a43d780a057004d0e092bee7ce0009d90b03bf" Jan 29 15:16:59 crc kubenswrapper[4753]: I0129 15:16:59.690036 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tlr2n" Jan 29 15:17:02 crc kubenswrapper[4753]: I0129 15:17:02.149877 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:17:02 crc kubenswrapper[4753]: E0129 15:17:02.151717 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:17:06 crc kubenswrapper[4753]: I0129 15:17:06.243005 4753 scope.go:117] "RemoveContainer" containerID="1cb2ef0e742ed0db993d4566bd709a32826d688b67e66c0349a74a83f03bd6ec" Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.646571 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krbm8"] Jan 29 15:17:13 crc kubenswrapper[4753]: E0129 15:17:13.647543 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" containerName="storage" Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.647561 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" containerName="storage" Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.647736 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" containerName="storage" Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.648911 4753 util.go:30] "No sandbox for pod can be found. 
Jan 29 15:17:02 crc kubenswrapper[4753]: I0129 15:17:02.149877 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:17:02 crc kubenswrapper[4753]: E0129 15:17:02.151717 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:17:06 crc kubenswrapper[4753]: I0129 15:17:06.243005 4753 scope.go:117] "RemoveContainer" containerID="1cb2ef0e742ed0db993d4566bd709a32826d688b67e66c0349a74a83f03bd6ec"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.646571 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krbm8"]
Jan 29 15:17:13 crc kubenswrapper[4753]: E0129 15:17:13.647543 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" containerName="storage"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.647561 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" containerName="storage"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.647736 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de80a9d-9ae6-4d73-abb5-a0d604c5ee6d" containerName="storage"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.648911 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.658991 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krbm8"]
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.728274 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-catalog-content\") pod \"redhat-marketplace-krbm8\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.728373 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x94zs\" (UniqueName: \"kubernetes.io/projected/fa366dbb-2a1c-40fa-b5c3-b136c673731a-kube-api-access-x94zs\") pod \"redhat-marketplace-krbm8\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.728402 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-utilities\") pod \"redhat-marketplace-krbm8\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.829856 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x94zs\" (UniqueName: \"kubernetes.io/projected/fa366dbb-2a1c-40fa-b5c3-b136c673731a-kube-api-access-x94zs\") pod \"redhat-marketplace-krbm8\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.829924 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-utilities\") pod \"redhat-marketplace-krbm8\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.830079 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-catalog-content\") pod \"redhat-marketplace-krbm8\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.830603 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-catalog-content\") pod \"redhat-marketplace-krbm8\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.830668 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-utilities\") pod \"redhat-marketplace-krbm8\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:13 crc kubenswrapper[4753]: I0129 15:17:13.850998 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x94zs\" (UniqueName: \"kubernetes.io/projected/fa366dbb-2a1c-40fa-b5c3-b136c673731a-kube-api-access-x94zs\") pod \"redhat-marketplace-krbm8\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:14 crc kubenswrapper[4753]: I0129 15:17:14.033195 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:17:14 crc kubenswrapper[4753]: I0129 15:17:14.276809 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krbm8"]
Jan 29 15:17:14 crc kubenswrapper[4753]: I0129 15:17:14.806018 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerID="76ae7d0240195461b315f6d1743e0aa04f288d1c895711303ec0de28db361608" exitCode=0
Jan 29 15:17:14 crc kubenswrapper[4753]: I0129 15:17:14.806063 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krbm8" event={"ID":"fa366dbb-2a1c-40fa-b5c3-b136c673731a","Type":"ContainerDied","Data":"76ae7d0240195461b315f6d1743e0aa04f288d1c895711303ec0de28db361608"}
Jan 29 15:17:14 crc kubenswrapper[4753]: I0129 15:17:14.806116 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krbm8" event={"ID":"fa366dbb-2a1c-40fa-b5c3-b136c673731a","Type":"ContainerStarted","Data":"9301f8747aaf9e063703cf05135f3a6c76e6cdec2241ac955746182316e7bd59"}
Jan 29 15:17:14 crc kubenswrapper[4753]: E0129 15:17:14.974882 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Jan 29 15:17:14 crc kubenswrapper[4753]: E0129 15:17:14.975043 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x94zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-krbm8_openshift-marketplace(fa366dbb-2a1c-40fa-b5c3-b136c673731a): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:17:14 crc kubenswrapper[4753]: E0129 15:17:14.976303 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-krbm8" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a"
Jan 29 15:17:15 crc kubenswrapper[4753]: E0129 15:17:15.816058 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-krbm8" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a"
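Every pull of registry.redhat.io/redhat/redhat-marketplace-index:v4.18 above dies at the same step: the registry refuses to issue a bearer token (403 Forbidden), so the kubelet records ErrImagePull and then throttles further attempts as ImagePullBackOff. A 403 at token time is an authorization problem, not a network one; on OpenShift/CRC the first thing to check is whether the cluster's global pull secret carries credentials for registry.redhat.io. A sketch, assuming the conventional location of that secret (openshift-config/pull-secret) and the kubernetes Python client:

    import base64
    import json

    from kubernetes import client, config

    config.load_kube_config()
    secret = client.CoreV1Api().read_namespaced_secret(
        "pull-secret", "openshift-config")
    dockercfg = json.loads(base64.b64decode(secret.data[".dockerconfigjson"]))
    # If this prints False, the 403s above are expected: no credentials at all.
    print("registry.redhat.io" in dockercfg.get("auths", {}))
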
Jan 29 15:17:17 crc kubenswrapper[4753]: I0129 15:17:17.149772 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:17:17 crc kubenswrapper[4753]: E0129 15:17:17.149999 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:17:30 crc kubenswrapper[4753]: E0129 15:17:30.283634 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Jan 29 15:17:30 crc kubenswrapper[4753]: E0129 15:17:30.284542 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x94zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-krbm8_openshift-marketplace(fa366dbb-2a1c-40fa-b5c3-b136c673731a): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:17:30 crc kubenswrapper[4753]: E0129 15:17:30.285899 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-krbm8" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a"
Jan 29 15:17:31 crc kubenswrapper[4753]: I0129 15:17:31.150057 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:17:31 crc kubenswrapper[4753]: E0129 15:17:31.150427 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:17:42 crc kubenswrapper[4753]: E0129 15:17:42.153962 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-krbm8" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a"
Jan 29 15:17:46 crc kubenswrapper[4753]: I0129 15:17:46.152756 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:17:46 crc kubenswrapper[4753]: E0129 15:17:46.153279 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:17:55 crc kubenswrapper[4753]: E0129 15:17:55.283346 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Jan 29 15:17:55 crc kubenswrapper[4753]: E0129 15:17:55.284034 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x94zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-krbm8_openshift-marketplace(fa366dbb-2a1c-40fa-b5c3-b136c673731a): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:17:55 crc kubenswrapper[4753]: E0129 15:17:55.285216 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-marketplace-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-marketplace-krbm8" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a"
Jan 29 15:17:57 crc kubenswrapper[4753]: I0129 15:17:57.149570 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:17:57 crc kubenswrapper[4753]: E0129 15:17:57.150079 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:18:06 crc kubenswrapper[4753]: E0129 15:18:06.155450 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-krbm8" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a"
Jan 29 15:18:11 crc kubenswrapper[4753]: I0129 15:18:11.149724 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:18:11 crc kubenswrapper[4753]: E0129 15:18:11.151630 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:18:18 crc kubenswrapper[4753]: E0129 15:18:18.151790 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-krbm8" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a"
Jan 29 15:18:23 crc kubenswrapper[4753]: I0129 15:18:23.149958 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:18:23 crc kubenswrapper[4753]: E0129 15:18:23.151311 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:18:29 crc kubenswrapper[4753]: E0129 15:18:29.152745 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-krbm8" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a"
Jan 29 15:18:34 crc kubenswrapper[4753]: I0129 15:18:34.149490 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:18:34 crc kubenswrapper[4753]: E0129 15:18:34.150551 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:18:43 crc kubenswrapper[4753]: I0129 15:18:43.504993 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerID="dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c" exitCode=0
Jan 29 15:18:43 crc kubenswrapper[4753]: I0129 15:18:43.505116 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krbm8" event={"ID":"fa366dbb-2a1c-40fa-b5c3-b136c673731a","Type":"ContainerDied","Data":"dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c"}
Jan 29 15:18:44 crc kubenswrapper[4753]: I0129 15:18:44.516294 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krbm8" event={"ID":"fa366dbb-2a1c-40fa-b5c3-b136c673731a","Type":"ContainerStarted","Data":"b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8"}
Jan 29 15:18:45 crc kubenswrapper[4753]: I0129 15:18:45.149644 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:18:45 crc kubenswrapper[4753]: E0129 15:18:45.149937 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:18:54 crc kubenswrapper[4753]: I0129 15:18:54.034296 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:18:54 crc kubenswrapper[4753]: I0129 15:18:54.035567 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:18:54 crc kubenswrapper[4753]: I0129 15:18:54.113661 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:18:54 crc kubenswrapper[4753]: I0129 15:18:54.141452 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krbm8" podStartSLOduration=11.876708116 podStartE2EDuration="1m41.141435938s" podCreationTimestamp="2026-01-29 15:17:13 +0000 UTC" firstStartedPulling="2026-01-29 15:17:14.807412343 +0000 UTC m=+4469.502146715" lastFinishedPulling="2026-01-29 15:18:44.072140145 +0000 UTC m=+4558.766874537" observedRunningTime="2026-01-29 15:18:44.541732058 +0000 UTC m=+4559.236466450" watchObservedRunningTime="2026-01-29 15:18:54.141435938 +0000 UTC m=+4568.836170310"
Jan 29 15:18:54 crc kubenswrapper[4753]: I0129 15:18:54.656126 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krbm8"
Jan 29 15:18:54 crc kubenswrapper[4753]: I0129 15:18:54.715559 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krbm8"]
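The startup-latency record above is internally consistent and shows where the 1m41s went. podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp; podStartSLOduration is the same span minus the image-pull window, which the kubelet measures on its monotonic clock (the m=+... values). Checking the arithmetic with the numbers exactly as logged:

    # watchObservedRunningTime 15:18:54.141435938 minus creation 15:17:13
    e2e = (18 * 60 + 54.141435938) - (17 * 60 + 13.0)   # = 1m41.141435938s
    # pull window on the monotonic clock:
    # lastFinishedPulling m=+4558.766874537 minus firstStartedPulling m=+4469.502146715
    pull = 4558.766874537 - 4469.502146715
    print(round(e2e, 9), round(e2e - pull, 9))          # 101.141435938 11.876708116

So of the 101 seconds end to end, about 89 were spent waiting out the ErrImagePull/ImagePullBackOff cycle before the pulls finally succeeded; the pod itself needed under 12 seconds.
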
podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerName="registry-server" containerID="cri-o://b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8" gracePeriod=2 Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.024813 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krbm8" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.135865 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x94zs\" (UniqueName: \"kubernetes.io/projected/fa366dbb-2a1c-40fa-b5c3-b136c673731a-kube-api-access-x94zs\") pod \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.135927 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-catalog-content\") pod \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.136017 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-utilities\") pod \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\" (UID: \"fa366dbb-2a1c-40fa-b5c3-b136c673731a\") " Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.137310 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-utilities" (OuterVolumeSpecName: "utilities") pod "fa366dbb-2a1c-40fa-b5c3-b136c673731a" (UID: "fa366dbb-2a1c-40fa-b5c3-b136c673731a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.142114 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa366dbb-2a1c-40fa-b5c3-b136c673731a-kube-api-access-x94zs" (OuterVolumeSpecName: "kube-api-access-x94zs") pod "fa366dbb-2a1c-40fa-b5c3-b136c673731a" (UID: "fa366dbb-2a1c-40fa-b5c3-b136c673731a"). InnerVolumeSpecName "kube-api-access-x94zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.163840 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa366dbb-2a1c-40fa-b5c3-b136c673731a" (UID: "fa366dbb-2a1c-40fa-b5c3-b136c673731a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.237107 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x94zs\" (UniqueName: \"kubernetes.io/projected/fa366dbb-2a1c-40fa-b5c3-b136c673731a-kube-api-access-x94zs\") on node \"crc\" DevicePath \"\"" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.237172 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.237188 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa366dbb-2a1c-40fa-b5c3-b136c673731a-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.638068 4753 generic.go:334] "Generic (PLEG): container finished" podID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerID="b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8" exitCode=0 Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.638543 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krbm8" event={"ID":"fa366dbb-2a1c-40fa-b5c3-b136c673731a","Type":"ContainerDied","Data":"b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8"} Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.638643 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krbm8" event={"ID":"fa366dbb-2a1c-40fa-b5c3-b136c673731a","Type":"ContainerDied","Data":"9301f8747aaf9e063703cf05135f3a6c76e6cdec2241ac955746182316e7bd59"} Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.638675 4753 scope.go:117] "RemoveContainer" containerID="b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.638700 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krbm8" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.660358 4753 scope.go:117] "RemoveContainer" containerID="dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c" Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.685258 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krbm8"] Jan 29 15:18:57 crc kubenswrapper[4753]: I0129 15:18:57.692198 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krbm8"] Jan 29 15:18:58 crc kubenswrapper[4753]: I0129 15:18:58.167287 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" path="/var/lib/kubelet/pods/fa366dbb-2a1c-40fa-b5c3-b136c673731a/volumes" Jan 29 15:18:58 crc kubenswrapper[4753]: I0129 15:18:58.211512 4753 scope.go:117] "RemoveContainer" containerID="76ae7d0240195461b315f6d1743e0aa04f288d1c895711303ec0de28db361608" Jan 29 15:18:58 crc kubenswrapper[4753]: I0129 15:18:58.229925 4753 scope.go:117] "RemoveContainer" containerID="b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8" Jan 29 15:18:58 crc kubenswrapper[4753]: E0129 15:18:58.230423 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8\": container with ID starting with b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8 not found: ID does not exist" containerID="b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8" Jan 29 15:18:58 crc kubenswrapper[4753]: I0129 15:18:58.230464 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8"} err="failed to get container status \"b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8\": rpc error: code = NotFound desc = could not find container \"b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8\": container with ID starting with b614934d57752c1606554c64e1cdcb915e45773679c51852d4beb808118412e8 not found: ID does not exist" Jan 29 15:18:58 crc kubenswrapper[4753]: I0129 15:18:58.230492 4753 scope.go:117] "RemoveContainer" containerID="dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c" Jan 29 15:18:58 crc kubenswrapper[4753]: E0129 15:18:58.230815 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c\": container with ID starting with dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c not found: ID does not exist" containerID="dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c" Jan 29 15:18:58 crc kubenswrapper[4753]: I0129 15:18:58.230890 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c"} err="failed to get container status \"dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c\": rpc error: code = NotFound desc = could not find container \"dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c\": container with ID starting with dbfecde040cf49d0318d2e7dfbd7065a655eae6ae7845f69cb3880477b706b7c not found: ID does not exist" Jan 29 15:18:58 crc kubenswrapper[4753]: I0129 
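The NotFound noise above is the benign tail of a delete race: all three containers were already removed in the first "RemoveContainer" pass, so when the pod's status is re-evaluated after the API DELETE/REMOVE, the second deletion attempt has nothing left to query. The kubelet logs the error and moves on, which is the standard idempotent-delete pattern. Sketched with hypothetical names (the runtime object and exception are stand-ins, not a real client API):

    class NotFoundError(Exception):
        """Stand-in for the runtime's NotFound status (the gRPC code above)."""

    def remove_container(runtime, container_id):
        # Deletion is idempotent: a container that is already gone counts
        # as successfully removed, so NotFound is logged, never fatal.
        try:
            runtime.remove(container_id)
        except NotFoundError:
            pass
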
Jan 29 15:18:59 crc kubenswrapper[4753]: I0129 15:18:59.149491 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:18:59 crc kubenswrapper[4753]: E0129 15:18:59.151083 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:19:13 crc kubenswrapper[4753]: I0129 15:19:13.149522 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:19:13 crc kubenswrapper[4753]: E0129 15:19:13.150317 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:19:28 crc kubenswrapper[4753]: I0129 15:19:28.149360 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:19:28 crc kubenswrapper[4753]: E0129 15:19:28.150499 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.724998 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9wxf5"]
Jan 29 15:19:29 crc kubenswrapper[4753]: E0129 15:19:29.725612 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerName="registry-server"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.725625 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerName="registry-server"
Jan 29 15:19:29 crc kubenswrapper[4753]: E0129 15:19:29.725655 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerName="extract-utilities"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.725663 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerName="extract-utilities"
Jan 29 15:19:29 crc kubenswrapper[4753]: E0129 15:19:29.725674 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerName="extract-content"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.725681 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerName="extract-content"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.725802 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa366dbb-2a1c-40fa-b5c3-b136c673731a" containerName="registry-server"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.726741 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.741579 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wxf5"]
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.798326 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-catalog-content\") pod \"certified-operators-9wxf5\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") " pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.798478 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qwgf\" (UniqueName: \"kubernetes.io/projected/2dfdc10e-dc31-4565-b790-9b778061ba36-kube-api-access-9qwgf\") pod \"certified-operators-9wxf5\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") " pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.798570 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-utilities\") pod \"certified-operators-9wxf5\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") " pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.899686 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-catalog-content\") pod \"certified-operators-9wxf5\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") " pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.900207 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-catalog-content\") pod \"certified-operators-9wxf5\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") " pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.900360 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qwgf\" (UniqueName: \"kubernetes.io/projected/2dfdc10e-dc31-4565-b790-9b778061ba36-kube-api-access-9qwgf\") pod \"certified-operators-9wxf5\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") " pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.900752 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-utilities\") pod \"certified-operators-9wxf5\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") " pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:29 crc kubenswrapper[4753]: I0129 15:19:29.901045 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-utilities\") pod \"certified-operators-9wxf5\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") " pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:30 crc kubenswrapper[4753]: I0129 15:19:30.095400 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qwgf\" (UniqueName: \"kubernetes.io/projected/2dfdc10e-dc31-4565-b790-9b778061ba36-kube-api-access-9qwgf\") pod \"certified-operators-9wxf5\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") " pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:30 crc kubenswrapper[4753]: I0129 15:19:30.348969 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:19:30 crc kubenswrapper[4753]: I0129 15:19:30.866011 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wxf5"]
Jan 29 15:19:30 crc kubenswrapper[4753]: I0129 15:19:30.890383 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wxf5" event={"ID":"2dfdc10e-dc31-4565-b790-9b778061ba36","Type":"ContainerStarted","Data":"64804fdedb8b5f372aaf28c349a3b67da65b715fae1a28d0bc9f9c7a4a6ec392"}
Jan 29 15:19:31 crc kubenswrapper[4753]: I0129 15:19:31.900479 4753 generic.go:334] "Generic (PLEG): container finished" podID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerID="05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa" exitCode=0
Jan 29 15:19:31 crc kubenswrapper[4753]: I0129 15:19:31.900548 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wxf5" event={"ID":"2dfdc10e-dc31-4565-b790-9b778061ba36","Type":"ContainerDied","Data":"05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa"}
Jan 29 15:19:32 crc kubenswrapper[4753]: E0129 15:19:32.042088 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Jan 29 15:19:32 crc kubenswrapper[4753]: E0129 15:19:32.042600 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qwgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9wxf5_openshift-marketplace(2dfdc10e-dc31-4565-b790-9b778061ba36): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:19:32 crc kubenswrapper[4753]: E0129 15:19:32.043835 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:19:32 crc kubenswrapper[4753]: E0129 15:19:32.908230 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.133273 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cxsh4"]
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.136566 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.142680 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxsh4"]
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.149708 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6bn6\" (UniqueName: \"kubernetes.io/projected/d7388aa3-c682-48f9-b7d2-db1beee6393e-kube-api-access-x6bn6\") pod \"redhat-operators-cxsh4\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") " pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.149814 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-catalog-content\") pod \"redhat-operators-cxsh4\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") " pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.149878 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-utilities\") pod \"redhat-operators-cxsh4\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") " pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.251234 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6bn6\" (UniqueName: \"kubernetes.io/projected/d7388aa3-c682-48f9-b7d2-db1beee6393e-kube-api-access-x6bn6\") pod \"redhat-operators-cxsh4\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") " pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.251289 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-catalog-content\") pod \"redhat-operators-cxsh4\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") " pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.251318 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-utilities\") pod \"redhat-operators-cxsh4\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") " pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.251815 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-catalog-content\") pod \"redhat-operators-cxsh4\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") " pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.251849 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-utilities\") pod \"redhat-operators-cxsh4\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") " pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.270493 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6bn6\" (UniqueName: \"kubernetes.io/projected/d7388aa3-c682-48f9-b7d2-db1beee6393e-kube-api-access-x6bn6\") pod \"redhat-operators-cxsh4\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") " pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.470086 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.711645 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxsh4"]
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.913403 4753 generic.go:334] "Generic (PLEG): container finished" podID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerID="83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca" exitCode=0
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.913456 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxsh4" event={"ID":"d7388aa3-c682-48f9-b7d2-db1beee6393e","Type":"ContainerDied","Data":"83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca"}
Jan 29 15:19:33 crc kubenswrapper[4753]: I0129 15:19:33.913748 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxsh4" event={"ID":"d7388aa3-c682-48f9-b7d2-db1beee6393e","Type":"ContainerStarted","Data":"24198fe41fb4f512e0d5435d1b669f34f9cd0df482de5c83cde4d4ca1a373227"}
Jan 29 15:19:34 crc kubenswrapper[4753]: E0129 15:19:34.053243 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Jan 29 15:19:34 crc kubenswrapper[4753]: E0129 15:19:34.053383 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6bn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cxsh4_openshift-marketplace(d7388aa3-c682-48f9-b7d2-db1beee6393e): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:19:34 crc kubenswrapper[4753]: E0129 15:19:34.054577 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:19:34 crc kubenswrapper[4753]: E0129 15:19:34.922069 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:19:42 crc kubenswrapper[4753]: I0129 15:19:42.150085 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:19:42 crc kubenswrapper[4753]: E0129 15:19:42.153323 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:19:44 crc kubenswrapper[4753]: E0129 15:19:44.280425 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Jan 29 15:19:44 crc kubenswrapper[4753]: E0129 15:19:44.281227 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qwgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9wxf5_openshift-marketplace(2dfdc10e-dc31-4565-b790-9b778061ba36): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:19:44 crc kubenswrapper[4753]: E0129 15:19:44.282639 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:19:47 crc kubenswrapper[4753]: E0129 15:19:47.277822 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Jan 29 15:19:47 crc kubenswrapper[4753]: E0129 15:19:47.278336 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6bn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cxsh4_openshift-marketplace(d7388aa3-c682-48f9-b7d2-db1beee6393e): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:19:47 crc kubenswrapper[4753]: E0129 15:19:47.279526 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:19:57 crc kubenswrapper[4753]: I0129 15:19:57.149893 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:19:57 crc kubenswrapper[4753]: E0129 15:19:57.150713 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:19:59 crc kubenswrapper[4753]: E0129 15:19:59.152060 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:20:02 crc kubenswrapper[4753]: E0129 15:20:02.150763 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
kubenswrapper[4753]: I0129 15:20:08.149431 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:20:08 crc kubenswrapper[4753]: E0129 15:20:08.150344 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:20:11 crc kubenswrapper[4753]: E0129 15:20:11.288652 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 15:20:11 crc kubenswrapper[4753]: E0129 15:20:11.289070 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qwgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9wxf5_openshift-marketplace(2dfdc10e-dc31-4565-b790-9b778061ba36): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:20:11 crc kubenswrapper[4753]: E0129 15:20:11.290202 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.215303 4753 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95587bc99-b8w7t"] Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.216578 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.218569 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.218626 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.218710 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.219579 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.219679 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7qxtz" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.238837 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-b8w7t"] Jan 29 15:20:13 crc kubenswrapper[4753]: E0129 15:20:13.268614 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 15:20:13 crc kubenswrapper[4753]: E0129 15:20:13.268752 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6bn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cxsh4_openshift-marketplace(d7388aa3-c682-48f9-b7d2-db1beee6393e): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 
403 (Forbidden)" logger="UnhandledError" Jan 29 15:20:13 crc kubenswrapper[4753]: E0129 15:20:13.269818 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.346789 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-config\") pod \"dnsmasq-dns-95587bc99-b8w7t\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.346837 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfkwp\" (UniqueName: \"kubernetes.io/projected/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-kube-api-access-tfkwp\") pod \"dnsmasq-dns-95587bc99-b8w7t\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.346861 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-dns-svc\") pod \"dnsmasq-dns-95587bc99-b8w7t\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.461593 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-config\") pod \"dnsmasq-dns-95587bc99-b8w7t\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.461657 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfkwp\" (UniqueName: \"kubernetes.io/projected/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-kube-api-access-tfkwp\") pod \"dnsmasq-dns-95587bc99-b8w7t\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.461678 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-dns-svc\") pod \"dnsmasq-dns-95587bc99-b8w7t\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.462581 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-dns-svc\") pod \"dnsmasq-dns-95587bc99-b8w7t\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.470767 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-config\") pod \"dnsmasq-dns-95587bc99-b8w7t\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " 
pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.492505 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfkwp\" (UniqueName: \"kubernetes.io/projected/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-kube-api-access-tfkwp\") pod \"dnsmasq-dns-95587bc99-b8w7t\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.534384 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.542563 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-9wm7w"] Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.543977 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.562940 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-9wm7w"] Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.664869 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-config\") pod \"dnsmasq-dns-5d79f765b5-9wm7w\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.664917 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f9tp\" (UniqueName: \"kubernetes.io/projected/84d53137-9bfe-4bf0-9703-b7a02bfb45be-kube-api-access-4f9tp\") pod \"dnsmasq-dns-5d79f765b5-9wm7w\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.664969 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-dns-svc\") pod \"dnsmasq-dns-5d79f765b5-9wm7w\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.767060 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-config\") pod \"dnsmasq-dns-5d79f765b5-9wm7w\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.767120 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9tp\" (UniqueName: \"kubernetes.io/projected/84d53137-9bfe-4bf0-9703-b7a02bfb45be-kube-api-access-4f9tp\") pod \"dnsmasq-dns-5d79f765b5-9wm7w\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.767180 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-dns-svc\") pod \"dnsmasq-dns-5d79f765b5-9wm7w\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.768045 
4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-config\") pod \"dnsmasq-dns-5d79f765b5-9wm7w\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.768060 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-dns-svc\") pod \"dnsmasq-dns-5d79f765b5-9wm7w\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.786913 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9tp\" (UniqueName: \"kubernetes.io/projected/84d53137-9bfe-4bf0-9703-b7a02bfb45be-kube-api-access-4f9tp\") pod \"dnsmasq-dns-5d79f765b5-9wm7w\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:13 crc kubenswrapper[4753]: I0129 15:20:13.892171 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.080038 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-b8w7t"] Jan 29 15:20:14 crc kubenswrapper[4753]: W0129 15:20:14.088132 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f4a7cbd_9d82_4bf9_b6c1_30547b13a61d.slice/crio-6cd3ad9e964521c86e734cf5d90fcafa8922eeff11e27c7076a9dccfacae20e7 WatchSource:0}: Error finding container 6cd3ad9e964521c86e734cf5d90fcafa8922eeff11e27c7076a9dccfacae20e7: Status 404 returned error can't find the container with id 6cd3ad9e964521c86e734cf5d90fcafa8922eeff11e27c7076a9dccfacae20e7 Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.203116 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" event={"ID":"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d","Type":"ContainerStarted","Data":"6cd3ad9e964521c86e734cf5d90fcafa8922eeff11e27c7076a9dccfacae20e7"} Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.370912 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.372383 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.374917 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.375040 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.375078 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.375389 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.375393 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-l7s95" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.390142 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.448930 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-9wm7w"] Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.475108 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.475176 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbx2w\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-kube-api-access-qbx2w\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.475237 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/52033640-0e60-4e49-a14c-fff49b4258ee-pod-info\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.475274 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.475317 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-server-conf\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.475343 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" 
Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.475415 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.475461 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.475486 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/52033640-0e60-4e49-a14c-fff49b4258ee-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576235 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/52033640-0e60-4e49-a14c-fff49b4258ee-pod-info\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576290 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576336 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-server-conf\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576359 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576387 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576409 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576432 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/52033640-0e60-4e49-a14c-fff49b4258ee-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576458 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576483 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbx2w\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-kube-api-access-qbx2w\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.576763 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.577043 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.577521 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.577761 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-server-conf\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.580632 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.581385 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/52033640-0e60-4e49-a14c-fff49b4258ee-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.581925 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.581961 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/71de4f8810de901ef5dd8c0afb3b0cb4e095c83e6fc96df3808fb90c3d031dcc/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.583534 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/52033640-0e60-4e49-a14c-fff49b4258ee-pod-info\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.594730 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbx2w\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-kube-api-access-qbx2w\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.610997 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") pod \"rabbitmq-server-0\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") " pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.681613 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.683038 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.684956 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-j5d7j" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.685770 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.686019 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.686259 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.686538 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.695938 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.697579 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.778447 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.778527 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.778554 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.778606 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.778634 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.778660 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.778696 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9pwz\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-kube-api-access-n9pwz\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.778729 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.778757 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.882312 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.882363 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.882398 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9pwz\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-kube-api-access-n9pwz\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.882435 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.882464 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.882507 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.882542 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.882568 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.882614 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.883826 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.884308 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.885420 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.886498 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.890055 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.890672 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.890720 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c9350eec8e6d65f69d55b571e856890720df0c7e44ce5b03367f1258321e98e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.892797 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.901390 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.910690 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9pwz\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-kube-api-access-n9pwz\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:14 crc kubenswrapper[4753]: I0129 15:20:14.923861 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") pod \"rabbitmq-cell1-server-0\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.041246 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.175138 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 15:20:15 crc kubenswrapper[4753]: W0129 15:20:15.184409 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52033640_0e60_4e49_a14c_fff49b4258ee.slice/crio-b26b62ec75ce866cd108fbf9fb2deb33f12538a189dfee272bd8c0706c3d889d WatchSource:0}: Error finding container b26b62ec75ce866cd108fbf9fb2deb33f12538a189dfee272bd8c0706c3d889d: Status 404 returned error can't find the container with id b26b62ec75ce866cd108fbf9fb2deb33f12538a189dfee272bd8c0706c3d889d Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.217869 4753 generic.go:334] "Generic (PLEG): container finished" podID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" containerID="8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522" exitCode=0 Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.217951 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" event={"ID":"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d","Type":"ContainerDied","Data":"8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522"} Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.221561 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52033640-0e60-4e49-a14c-fff49b4258ee","Type":"ContainerStarted","Data":"b26b62ec75ce866cd108fbf9fb2deb33f12538a189dfee272bd8c0706c3d889d"} Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.227951 4753 generic.go:334] "Generic (PLEG): container finished" podID="84d53137-9bfe-4bf0-9703-b7a02bfb45be" containerID="1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693" exitCode=0 Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.227994 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" event={"ID":"84d53137-9bfe-4bf0-9703-b7a02bfb45be","Type":"ContainerDied","Data":"1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693"} Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.228015 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" event={"ID":"84d53137-9bfe-4bf0-9703-b7a02bfb45be","Type":"ContainerStarted","Data":"a3c20d369e244e2adaacdcabf5803c3905d9eef7af2147702c4b4e33842151e2"} Jan 29 15:20:15 crc kubenswrapper[4753]: E0129 15:20:15.375797 4753 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 29 15:20:15 crc kubenswrapper[4753]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 29 15:20:15 crc kubenswrapper[4753]: > podSandboxID="6cd3ad9e964521c86e734cf5d90fcafa8922eeff11e27c7076a9dccfacae20e7" Jan 29 15:20:15 crc kubenswrapper[4753]: E0129 15:20:15.375997 4753 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 29 15:20:15 crc kubenswrapper[4753]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfkwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-95587bc99-b8w7t_openstack(4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 29 15:20:15 crc kubenswrapper[4753]: > logger="UnhandledError" Jan 29 15:20:15 crc kubenswrapper[4753]: E0129 15:20:15.377163 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" podUID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.488703 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 15:20:15 crc kubenswrapper[4753]: W0129 15:20:15.504772 4753 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5573e3e_7af4_4766_a43c_d9d0f2cf6f42.slice/crio-c44a9967aa4a95151c886b6ce709ff1d747dfcb87cfb53f35b442536bd91c8a1 WatchSource:0}: Error finding container c44a9967aa4a95151c886b6ce709ff1d747dfcb87cfb53f35b442536bd91c8a1: Status 404 returned error can't find the container with id c44a9967aa4a95151c886b6ce709ff1d747dfcb87cfb53f35b442536bd91c8a1 Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.981283 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.982683 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.985044 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-skjj5" Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.985284 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.985367 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.985417 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 29 15:20:15 crc kubenswrapper[4753]: I0129 15:20:15.995608 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.002434 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.098673 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfdd3cb-77ed-4232-898c-8b61bad9c133-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.098729 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edfdd3cb-77ed-4232-898c-8b61bad9c133-operator-scripts\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.098945 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chbvj\" (UniqueName: \"kubernetes.io/projected/edfdd3cb-77ed-4232-898c-8b61bad9c133-kube-api-access-chbvj\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.099056 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efcb27c0-df74-4020-bc8d-73da499d4754\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efcb27c0-df74-4020-bc8d-73da499d4754\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.099177 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/edfdd3cb-77ed-4232-898c-8b61bad9c133-config-data-default\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.099212 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfdd3cb-77ed-4232-898c-8b61bad9c133-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.099249 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/edfdd3cb-77ed-4232-898c-8b61bad9c133-kolla-config\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.099291 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/edfdd3cb-77ed-4232-898c-8b61bad9c133-config-data-generated\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.200513 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/edfdd3cb-77ed-4232-898c-8b61bad9c133-kolla-config\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.200818 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/edfdd3cb-77ed-4232-898c-8b61bad9c133-config-data-generated\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.200949 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfdd3cb-77ed-4232-898c-8b61bad9c133-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.201023 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edfdd3cb-77ed-4232-898c-8b61bad9c133-operator-scripts\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.201114 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chbvj\" (UniqueName: \"kubernetes.io/projected/edfdd3cb-77ed-4232-898c-8b61bad9c133-kube-api-access-chbvj\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.201227 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efcb27c0-df74-4020-bc8d-73da499d4754\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efcb27c0-df74-4020-bc8d-73da499d4754\") pod 
\"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.201329 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/edfdd3cb-77ed-4232-898c-8b61bad9c133-config-data-default\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.201412 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfdd3cb-77ed-4232-898c-8b61bad9c133-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.201865 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/edfdd3cb-77ed-4232-898c-8b61bad9c133-config-data-generated\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.201368 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/edfdd3cb-77ed-4232-898c-8b61bad9c133-kolla-config\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.202137 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/edfdd3cb-77ed-4232-898c-8b61bad9c133-config-data-default\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.202577 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edfdd3cb-77ed-4232-898c-8b61bad9c133-operator-scripts\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.203501 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.203532 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efcb27c0-df74-4020-bc8d-73da499d4754\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efcb27c0-df74-4020-bc8d-73da499d4754\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e9447a77ae03f0360d0b54b4dc4c8bdf5cb55e51959a20c92be2d848ac49f9f4/globalmount\"" pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.238842 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" event={"ID":"84d53137-9bfe-4bf0-9703-b7a02bfb45be","Type":"ContainerStarted","Data":"c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19"} Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.240268 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.242321 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42","Type":"ContainerStarted","Data":"c44a9967aa4a95151c886b6ce709ff1d747dfcb87cfb53f35b442536bd91c8a1"} Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.264328 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" podStartSLOduration=3.264307418 podStartE2EDuration="3.264307418s" podCreationTimestamp="2026-01-29 15:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:20:16.26289978 +0000 UTC m=+4650.957634162" watchObservedRunningTime="2026-01-29 15:20:16.264307418 +0000 UTC m=+4650.959041800" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.397888 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.400462 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.402397 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zq58q" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.404462 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.417715 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.509765 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wdt\" (UniqueName: \"kubernetes.io/projected/564d2d65-9b14-436e-9b59-93679ab0466f-kube-api-access-87wdt\") pod \"memcached-0\" (UID: \"564d2d65-9b14-436e-9b59-93679ab0466f\") " pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.509833 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/564d2d65-9b14-436e-9b59-93679ab0466f-kolla-config\") pod \"memcached-0\" (UID: \"564d2d65-9b14-436e-9b59-93679ab0466f\") " pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.509936 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/564d2d65-9b14-436e-9b59-93679ab0466f-config-data\") pod \"memcached-0\" (UID: \"564d2d65-9b14-436e-9b59-93679ab0466f\") " pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.512128 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfdd3cb-77ed-4232-898c-8b61bad9c133-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.512398 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chbvj\" (UniqueName: \"kubernetes.io/projected/edfdd3cb-77ed-4232-898c-8b61bad9c133-kube-api-access-chbvj\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.513626 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfdd3cb-77ed-4232-898c-8b61bad9c133-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.611611 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87wdt\" (UniqueName: \"kubernetes.io/projected/564d2d65-9b14-436e-9b59-93679ab0466f-kube-api-access-87wdt\") pod \"memcached-0\" (UID: \"564d2d65-9b14-436e-9b59-93679ab0466f\") " pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.612070 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/564d2d65-9b14-436e-9b59-93679ab0466f-kolla-config\") pod \"memcached-0\" (UID: \"564d2d65-9b14-436e-9b59-93679ab0466f\") " pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 
15:20:16.612264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/564d2d65-9b14-436e-9b59-93679ab0466f-config-data\") pod \"memcached-0\" (UID: \"564d2d65-9b14-436e-9b59-93679ab0466f\") " pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.612902 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/564d2d65-9b14-436e-9b59-93679ab0466f-kolla-config\") pod \"memcached-0\" (UID: \"564d2d65-9b14-436e-9b59-93679ab0466f\") " pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.613093 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/564d2d65-9b14-436e-9b59-93679ab0466f-config-data\") pod \"memcached-0\" (UID: \"564d2d65-9b14-436e-9b59-93679ab0466f\") " pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.694493 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wdt\" (UniqueName: \"kubernetes.io/projected/564d2d65-9b14-436e-9b59-93679ab0466f-kube-api-access-87wdt\") pod \"memcached-0\" (UID: \"564d2d65-9b14-436e-9b59-93679ab0466f\") " pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.708906 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efcb27c0-df74-4020-bc8d-73da499d4754\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efcb27c0-df74-4020-bc8d-73da499d4754\") pod \"openstack-galera-0\" (UID: \"edfdd3cb-77ed-4232-898c-8b61bad9c133\") " pod="openstack/openstack-galera-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.727206 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 29 15:20:16 crc kubenswrapper[4753]: I0129 15:20:16.904354 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.276860 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" event={"ID":"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d","Type":"ContainerStarted","Data":"a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d"} Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.278036 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.297458 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42","Type":"ContainerStarted","Data":"7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200"} Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.311203 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52033640-0e60-4e49-a14c-fff49b4258ee","Type":"ContainerStarted","Data":"dfa2afb048901242468ccf990eac6de707c46ca94fba1edf1367dc3e2e3f2cc0"} Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.318475 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" podStartSLOduration=4.318454445 podStartE2EDuration="4.318454445s" podCreationTimestamp="2026-01-29 15:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:20:17.30751849 +0000 UTC m=+4652.002252872" watchObservedRunningTime="2026-01-29 15:20:17.318454445 +0000 UTC m=+4652.013188827" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.336957 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.469623 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 29 15:20:17 crc kubenswrapper[4753]: W0129 15:20:17.472192 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedfdd3cb_77ed_4232_898c_8b61bad9c133.slice/crio-89b237582e093cecad77b8f7cd389f2aaf3286977b759cf822595f6a89f9d31d WatchSource:0}: Error finding container 89b237582e093cecad77b8f7cd389f2aaf3286977b759cf822595f6a89f9d31d: Status 404 returned error can't find the container with id 89b237582e093cecad77b8f7cd389f2aaf3286977b759cf822595f6a89f9d31d Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.580998 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.582441 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.585328 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-zcjbl" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.585786 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.586033 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.586061 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.597649 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.627467 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0686c722-99c5-44dd-994b-3525d5642d96-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.627521 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0686c722-99c5-44dd-994b-3525d5642d96-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.627571 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0686c722-99c5-44dd-994b-3525d5642d96-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.627610 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5e9753c0-e030-4cd8-84bb-58cf89aa1f52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e9753c0-e030-4cd8-84bb-58cf89aa1f52\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.627629 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0686c722-99c5-44dd-994b-3525d5642d96-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.627818 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6w6p\" (UniqueName: \"kubernetes.io/projected/0686c722-99c5-44dd-994b-3525d5642d96-kube-api-access-v6w6p\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.627958 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0686c722-99c5-44dd-994b-3525d5642d96-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.628024 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0686c722-99c5-44dd-994b-3525d5642d96-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.729557 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0686c722-99c5-44dd-994b-3525d5642d96-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.729685 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0686c722-99c5-44dd-994b-3525d5642d96-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.730835 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0686c722-99c5-44dd-994b-3525d5642d96-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.731290 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5e9753c0-e030-4cd8-84bb-58cf89aa1f52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e9753c0-e030-4cd8-84bb-58cf89aa1f52\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.731367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0686c722-99c5-44dd-994b-3525d5642d96-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.731368 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0686c722-99c5-44dd-994b-3525d5642d96-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.731410 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6w6p\" (UniqueName: \"kubernetes.io/projected/0686c722-99c5-44dd-994b-3525d5642d96-kube-api-access-v6w6p\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.731495 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0686c722-99c5-44dd-994b-3525d5642d96-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.731554 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0686c722-99c5-44dd-994b-3525d5642d96-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.731626 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0686c722-99c5-44dd-994b-3525d5642d96-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.731856 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0686c722-99c5-44dd-994b-3525d5642d96-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.732414 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0686c722-99c5-44dd-994b-3525d5642d96-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.734176 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.734204 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5e9753c0-e030-4cd8-84bb-58cf89aa1f52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e9753c0-e030-4cd8-84bb-58cf89aa1f52\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d63f33260d14784f6f15e0ed1641be46f5ef6a55bc3c65fd1b7df6494838f689/globalmount\"" pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.795726 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6w6p\" (UniqueName: \"kubernetes.io/projected/0686c722-99c5-44dd-994b-3525d5642d96-kube-api-access-v6w6p\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.795916 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0686c722-99c5-44dd-994b-3525d5642d96-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.796444 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0686c722-99c5-44dd-994b-3525d5642d96-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.818855 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5e9753c0-e030-4cd8-84bb-58cf89aa1f52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e9753c0-e030-4cd8-84bb-58cf89aa1f52\") pod \"openstack-cell1-galera-0\" (UID: \"0686c722-99c5-44dd-994b-3525d5642d96\") " pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:17 crc kubenswrapper[4753]: I0129 15:20:17.882930 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:18 crc kubenswrapper[4753]: I0129 15:20:18.318875 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"564d2d65-9b14-436e-9b59-93679ab0466f","Type":"ContainerStarted","Data":"b03d1d00b97790d9dc31cd00122ccd7fee322120165edc62e40a1c8fddd02980"} Jan 29 15:20:18 crc kubenswrapper[4753]: I0129 15:20:18.319261 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"564d2d65-9b14-436e-9b59-93679ab0466f","Type":"ContainerStarted","Data":"21e29a8a7bc84af5945433652cfab2983df765c5d598ddacdb3f2b760dfaa6bd"} Jan 29 15:20:18 crc kubenswrapper[4753]: I0129 15:20:18.319319 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 29 15:20:18 crc kubenswrapper[4753]: I0129 15:20:18.320863 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"edfdd3cb-77ed-4232-898c-8b61bad9c133","Type":"ContainerStarted","Data":"56eb61d3e62b6a84875a341b9d71a67157e753a63c3d053587ef943be5c7cb94"} Jan 29 15:20:18 crc kubenswrapper[4753]: I0129 15:20:18.320970 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"edfdd3cb-77ed-4232-898c-8b61bad9c133","Type":"ContainerStarted","Data":"89b237582e093cecad77b8f7cd389f2aaf3286977b759cf822595f6a89f9d31d"} Jan 29 15:20:18 crc kubenswrapper[4753]: I0129 15:20:18.338072 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.3380512700000002 podStartE2EDuration="2.33805127s" podCreationTimestamp="2026-01-29 15:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:20:18.335542222 +0000 UTC m=+4653.030276624" watchObservedRunningTime="2026-01-29 15:20:18.33805127 +0000 UTC m=+4653.032785652" Jan 29 15:20:18 crc kubenswrapper[4753]: I0129 15:20:18.357953 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 29 15:20:18 crc kubenswrapper[4753]: W0129 15:20:18.358935 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0686c722_99c5_44dd_994b_3525d5642d96.slice/crio-045621b8de373b7ab22ceb8f81b76b45c8b5af8880184d21fd2730757836f671 WatchSource:0}: Error finding container 045621b8de373b7ab22ceb8f81b76b45c8b5af8880184d21fd2730757836f671: Status 404 returned error can't find the container with id 045621b8de373b7ab22ceb8f81b76b45c8b5af8880184d21fd2730757836f671 Jan 29 15:20:19 crc kubenswrapper[4753]: I0129 15:20:19.150011 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:20:19 crc kubenswrapper[4753]: E0129 15:20:19.150764 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:20:19 crc kubenswrapper[4753]: I0129 15:20:19.344794 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"0686c722-99c5-44dd-994b-3525d5642d96","Type":"ContainerStarted","Data":"307a1227713183a4ac031b656a6e41b0a2df5ad78ba52c3397b297f6efcb2da6"} Jan 29 15:20:19 crc kubenswrapper[4753]: I0129 15:20:19.344906 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0686c722-99c5-44dd-994b-3525d5642d96","Type":"ContainerStarted","Data":"045621b8de373b7ab22ceb8f81b76b45c8b5af8880184d21fd2730757836f671"} Jan 29 15:20:22 crc kubenswrapper[4753]: I0129 15:20:22.363496 4753 generic.go:334] "Generic (PLEG): container finished" podID="0686c722-99c5-44dd-994b-3525d5642d96" containerID="307a1227713183a4ac031b656a6e41b0a2df5ad78ba52c3397b297f6efcb2da6" exitCode=0 Jan 29 15:20:22 crc kubenswrapper[4753]: I0129 15:20:22.365023 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0686c722-99c5-44dd-994b-3525d5642d96","Type":"ContainerDied","Data":"307a1227713183a4ac031b656a6e41b0a2df5ad78ba52c3397b297f6efcb2da6"} Jan 29 15:20:22 crc kubenswrapper[4753]: I0129 15:20:22.369091 4753 generic.go:334] "Generic (PLEG): container finished" podID="edfdd3cb-77ed-4232-898c-8b61bad9c133" containerID="56eb61d3e62b6a84875a341b9d71a67157e753a63c3d053587ef943be5c7cb94" exitCode=0 Jan 29 15:20:22 crc kubenswrapper[4753]: I0129 15:20:22.369122 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"edfdd3cb-77ed-4232-898c-8b61bad9c133","Type":"ContainerDied","Data":"56eb61d3e62b6a84875a341b9d71a67157e753a63c3d053587ef943be5c7cb94"} Jan 29 15:20:23 crc kubenswrapper[4753]: I0129 15:20:23.378072 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0686c722-99c5-44dd-994b-3525d5642d96","Type":"ContainerStarted","Data":"846d10e598a9010b1f79240a5922d24745a31c6ef78c7f85ab783c989867db6f"} Jan 29 15:20:23 crc kubenswrapper[4753]: I0129 15:20:23.380089 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"edfdd3cb-77ed-4232-898c-8b61bad9c133","Type":"ContainerStarted","Data":"1102a3fcd9ae51ddd96afb83f3417f0c5e5dee9e25972f1294bc8724ecf356e9"} Jan 29 15:20:23 crc kubenswrapper[4753]: I0129 15:20:23.411734 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.411719711 podStartE2EDuration="7.411719711s" podCreationTimestamp="2026-01-29 15:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:20:23.409543271 +0000 UTC m=+4658.104277673" watchObservedRunningTime="2026-01-29 15:20:23.411719711 +0000 UTC m=+4658.106454093" Jan 29 15:20:23 crc kubenswrapper[4753]: I0129 15:20:23.434412 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.434392602 podStartE2EDuration="9.434392602s" podCreationTimestamp="2026-01-29 15:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:20:23.429020917 +0000 UTC m=+4658.123755299" watchObservedRunningTime="2026-01-29 15:20:23.434392602 +0000 UTC m=+4658.129126984" Jan 29 15:20:23 crc kubenswrapper[4753]: I0129 15:20:23.537432 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:23 crc kubenswrapper[4753]: I0129 
15:20:23.893324 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:20:23 crc kubenswrapper[4753]: I0129 15:20:23.952749 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-b8w7t"] Jan 29 15:20:24 crc kubenswrapper[4753]: I0129 15:20:24.386627 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" podUID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" containerName="dnsmasq-dns" containerID="cri-o://a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d" gracePeriod=10 Jan 29 15:20:24 crc kubenswrapper[4753]: I0129 15:20:24.807426 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:24 crc kubenswrapper[4753]: I0129 15:20:24.958573 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-config\") pod \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " Jan 29 15:20:24 crc kubenswrapper[4753]: I0129 15:20:24.958681 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfkwp\" (UniqueName: \"kubernetes.io/projected/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-kube-api-access-tfkwp\") pod \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " Jan 29 15:20:24 crc kubenswrapper[4753]: I0129 15:20:24.958725 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-dns-svc\") pod \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\" (UID: \"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d\") " Jan 29 15:20:24 crc kubenswrapper[4753]: I0129 15:20:24.965216 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-kube-api-access-tfkwp" (OuterVolumeSpecName: "kube-api-access-tfkwp") pod "4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" (UID: "4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d"). InnerVolumeSpecName "kube-api-access-tfkwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.005249 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" (UID: "4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.005701 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-config" (OuterVolumeSpecName: "config") pod "4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" (UID: "4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.061002 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.061039 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfkwp\" (UniqueName: \"kubernetes.io/projected/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-kube-api-access-tfkwp\") on node \"crc\" DevicePath \"\"" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.061052 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:20:25 crc kubenswrapper[4753]: E0129 15:20:25.150797 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:20:25 crc kubenswrapper[4753]: E0129 15:20:25.150855 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.392925 4753 generic.go:334] "Generic (PLEG): container finished" podID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" containerID="a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d" exitCode=0 Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.392966 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" event={"ID":"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d","Type":"ContainerDied","Data":"a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d"} Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.392992 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" event={"ID":"4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d","Type":"ContainerDied","Data":"6cd3ad9e964521c86e734cf5d90fcafa8922eeff11e27c7076a9dccfacae20e7"} Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.393007 4753 scope.go:117] "RemoveContainer" containerID="a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.393119 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95587bc99-b8w7t" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.419533 4753 scope.go:117] "RemoveContainer" containerID="8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.422662 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-b8w7t"] Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.433266 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95587bc99-b8w7t"] Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.459913 4753 scope.go:117] "RemoveContainer" containerID="a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d" Jan 29 15:20:25 crc kubenswrapper[4753]: E0129 15:20:25.460315 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d\": container with ID starting with a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d not found: ID does not exist" containerID="a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.460366 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d"} err="failed to get container status \"a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d\": rpc error: code = NotFound desc = could not find container \"a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d\": container with ID starting with a2b69414f32873b3a75e00b63679d4c9dc75b23d84a8a6c07516df01f0ae981d not found: ID does not exist" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.460397 4753 scope.go:117] "RemoveContainer" containerID="8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522" Jan 29 15:20:25 crc kubenswrapper[4753]: E0129 15:20:25.460858 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522\": container with ID starting with 8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522 not found: ID does not exist" containerID="8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522" Jan 29 15:20:25 crc kubenswrapper[4753]: I0129 15:20:25.460891 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522"} err="failed to get container status \"8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522\": rpc error: code = NotFound desc = could not find container \"8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522\": container with ID starting with 8e2c15a7dcd31d36a245d14ed1925454d8271f01eb2ddfb90a8a43df022af522 not found: ID does not exist" Jan 29 15:20:26 crc kubenswrapper[4753]: I0129 15:20:26.163136 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" path="/var/lib/kubelet/pods/4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d/volumes" Jan 29 15:20:26 crc kubenswrapper[4753]: I0129 15:20:26.729303 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 29 15:20:26 crc kubenswrapper[4753]: I0129 15:20:26.904851 4753 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 29 15:20:26 crc kubenswrapper[4753]: I0129 15:20:26.905189 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 29 15:20:27 crc kubenswrapper[4753]: I0129 15:20:27.201324 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 29 15:20:27 crc kubenswrapper[4753]: I0129 15:20:27.487415 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 29 15:20:27 crc kubenswrapper[4753]: I0129 15:20:27.883615 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:27 crc kubenswrapper[4753]: I0129 15:20:27.883670 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:30 crc kubenswrapper[4753]: I0129 15:20:30.777730 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:30 crc kubenswrapper[4753]: I0129 15:20:30.846231 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 29 15:20:31 crc kubenswrapper[4753]: I0129 15:20:31.149869 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20" Jan 29 15:20:31 crc kubenswrapper[4753]: I0129 15:20:31.440960 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"87c9b25e517a2d6c985d3c8b6f73dd009a38a0273f33cbbbc56c00d084557cd5"} Jan 29 15:20:34 crc kubenswrapper[4753]: I0129 15:20:34.937463 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9qj5m"] Jan 29 15:20:34 crc kubenswrapper[4753]: E0129 15:20:34.939929 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" containerName="init" Jan 29 15:20:34 crc kubenswrapper[4753]: I0129 15:20:34.940096 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" containerName="init" Jan 29 15:20:34 crc kubenswrapper[4753]: E0129 15:20:34.940256 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" containerName="dnsmasq-dns" Jan 29 15:20:34 crc kubenswrapper[4753]: I0129 15:20:34.940360 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" containerName="dnsmasq-dns" Jan 29 15:20:34 crc kubenswrapper[4753]: I0129 15:20:34.940715 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4a7cbd-9d82-4bf9-b6c1-30547b13a61d" containerName="dnsmasq-dns" Jan 29 15:20:34 crc kubenswrapper[4753]: I0129 15:20:34.941707 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9qj5m" Jan 29 15:20:34 crc kubenswrapper[4753]: I0129 15:20:34.948112 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9qj5m"] Jan 29 15:20:34 crc kubenswrapper[4753]: I0129 15:20:34.950579 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 29 15:20:35 crc kubenswrapper[4753]: I0129 15:20:35.008633 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph6m4\" (UniqueName: \"kubernetes.io/projected/2f51d374-70e7-4ff6-acef-cda575a5a3b9-kube-api-access-ph6m4\") pod \"root-account-create-update-9qj5m\" (UID: \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\") " pod="openstack/root-account-create-update-9qj5m" Jan 29 15:20:35 crc kubenswrapper[4753]: I0129 15:20:35.009048 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f51d374-70e7-4ff6-acef-cda575a5a3b9-operator-scripts\") pod \"root-account-create-update-9qj5m\" (UID: \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\") " pod="openstack/root-account-create-update-9qj5m" Jan 29 15:20:35 crc kubenswrapper[4753]: I0129 15:20:35.111090 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph6m4\" (UniqueName: \"kubernetes.io/projected/2f51d374-70e7-4ff6-acef-cda575a5a3b9-kube-api-access-ph6m4\") pod \"root-account-create-update-9qj5m\" (UID: \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\") " pod="openstack/root-account-create-update-9qj5m" Jan 29 15:20:35 crc kubenswrapper[4753]: I0129 15:20:35.111232 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f51d374-70e7-4ff6-acef-cda575a5a3b9-operator-scripts\") pod \"root-account-create-update-9qj5m\" (UID: \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\") " pod="openstack/root-account-create-update-9qj5m" Jan 29 15:20:35 crc kubenswrapper[4753]: I0129 15:20:35.112225 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f51d374-70e7-4ff6-acef-cda575a5a3b9-operator-scripts\") pod \"root-account-create-update-9qj5m\" (UID: \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\") " pod="openstack/root-account-create-update-9qj5m" Jan 29 15:20:35 crc kubenswrapper[4753]: I0129 15:20:35.139512 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph6m4\" (UniqueName: \"kubernetes.io/projected/2f51d374-70e7-4ff6-acef-cda575a5a3b9-kube-api-access-ph6m4\") pod \"root-account-create-update-9qj5m\" (UID: \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\") " pod="openstack/root-account-create-update-9qj5m" Jan 29 15:20:35 crc kubenswrapper[4753]: I0129 15:20:35.274324 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9qj5m" Jan 29 15:20:35 crc kubenswrapper[4753]: I0129 15:20:35.761081 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9qj5m"] Jan 29 15:20:35 crc kubenswrapper[4753]: W0129 15:20:35.766126 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f51d374_70e7_4ff6_acef_cda575a5a3b9.slice/crio-a215f75b34d78817d8fa314e860343ef9e8abe9ae576d5c6bd19ef1da849ee10 WatchSource:0}: Error finding container a215f75b34d78817d8fa314e860343ef9e8abe9ae576d5c6bd19ef1da849ee10: Status 404 returned error can't find the container with id a215f75b34d78817d8fa314e860343ef9e8abe9ae576d5c6bd19ef1da849ee10 Jan 29 15:20:36 crc kubenswrapper[4753]: I0129 15:20:36.480901 4753 generic.go:334] "Generic (PLEG): container finished" podID="2f51d374-70e7-4ff6-acef-cda575a5a3b9" containerID="1969da0e5321749e4e89d0fea5bb7854b3710a35c18dccb0896600571dc2014e" exitCode=0 Jan 29 15:20:36 crc kubenswrapper[4753]: I0129 15:20:36.480978 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9qj5m" event={"ID":"2f51d374-70e7-4ff6-acef-cda575a5a3b9","Type":"ContainerDied","Data":"1969da0e5321749e4e89d0fea5bb7854b3710a35c18dccb0896600571dc2014e"} Jan 29 15:20:36 crc kubenswrapper[4753]: I0129 15:20:36.481488 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9qj5m" event={"ID":"2f51d374-70e7-4ff6-acef-cda575a5a3b9","Type":"ContainerStarted","Data":"a215f75b34d78817d8fa314e860343ef9e8abe9ae576d5c6bd19ef1da849ee10"} Jan 29 15:20:37 crc kubenswrapper[4753]: E0129 15:20:37.153066 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:20:37 crc kubenswrapper[4753]: I0129 15:20:37.792233 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9qj5m" Jan 29 15:20:37 crc kubenswrapper[4753]: I0129 15:20:37.875519 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f51d374-70e7-4ff6-acef-cda575a5a3b9-operator-scripts\") pod \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\" (UID: \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\") " Jan 29 15:20:37 crc kubenswrapper[4753]: I0129 15:20:37.875778 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph6m4\" (UniqueName: \"kubernetes.io/projected/2f51d374-70e7-4ff6-acef-cda575a5a3b9-kube-api-access-ph6m4\") pod \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\" (UID: \"2f51d374-70e7-4ff6-acef-cda575a5a3b9\") " Jan 29 15:20:37 crc kubenswrapper[4753]: I0129 15:20:37.876550 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f51d374-70e7-4ff6-acef-cda575a5a3b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f51d374-70e7-4ff6-acef-cda575a5a3b9" (UID: "2f51d374-70e7-4ff6-acef-cda575a5a3b9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:20:37 crc kubenswrapper[4753]: I0129 15:20:37.899828 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f51d374-70e7-4ff6-acef-cda575a5a3b9-kube-api-access-ph6m4" (OuterVolumeSpecName: "kube-api-access-ph6m4") pod "2f51d374-70e7-4ff6-acef-cda575a5a3b9" (UID: "2f51d374-70e7-4ff6-acef-cda575a5a3b9"). InnerVolumeSpecName "kube-api-access-ph6m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:20:37 crc kubenswrapper[4753]: I0129 15:20:37.977686 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f51d374-70e7-4ff6-acef-cda575a5a3b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:20:37 crc kubenswrapper[4753]: I0129 15:20:37.977720 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph6m4\" (UniqueName: \"kubernetes.io/projected/2f51d374-70e7-4ff6-acef-cda575a5a3b9-kube-api-access-ph6m4\") on node \"crc\" DevicePath \"\"" Jan 29 15:20:38 crc kubenswrapper[4753]: E0129 15:20:38.150819 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:20:38 crc kubenswrapper[4753]: I0129 15:20:38.502034 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9qj5m" event={"ID":"2f51d374-70e7-4ff6-acef-cda575a5a3b9","Type":"ContainerDied","Data":"a215f75b34d78817d8fa314e860343ef9e8abe9ae576d5c6bd19ef1da849ee10"} Jan 29 15:20:38 crc kubenswrapper[4753]: I0129 15:20:38.502075 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a215f75b34d78817d8fa314e860343ef9e8abe9ae576d5c6bd19ef1da849ee10" Jan 29 15:20:38 crc kubenswrapper[4753]: I0129 15:20:38.502128 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9qj5m"
Jan 29 15:20:41 crc kubenswrapper[4753]: I0129 15:20:41.555205 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9qj5m"]
Jan 29 15:20:41 crc kubenswrapper[4753]: I0129 15:20:41.562765 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9qj5m"]
Jan 29 15:20:42 crc kubenswrapper[4753]: I0129 15:20:42.163981 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f51d374-70e7-4ff6-acef-cda575a5a3b9" path="/var/lib/kubelet/pods/2f51d374-70e7-4ff6-acef-cda575a5a3b9/volumes"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.584956 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nd6z4"]
Jan 29 15:20:46 crc kubenswrapper[4753]: E0129 15:20:46.587525 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f51d374-70e7-4ff6-acef-cda575a5a3b9" containerName="mariadb-account-create-update"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.587710 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f51d374-70e7-4ff6-acef-cda575a5a3b9" containerName="mariadb-account-create-update"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.588061 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f51d374-70e7-4ff6-acef-cda575a5a3b9" containerName="mariadb-account-create-update"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.590186 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.593583 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.604322 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nd6z4"]
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.716112 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c997c121-d070-4686-ba00-f4a025096b7b-operator-scripts\") pod \"root-account-create-update-nd6z4\" (UID: \"c997c121-d070-4686-ba00-f4a025096b7b\") " pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.716189 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kpc2\" (UniqueName: \"kubernetes.io/projected/c997c121-d070-4686-ba00-f4a025096b7b-kube-api-access-9kpc2\") pod \"root-account-create-update-nd6z4\" (UID: \"c997c121-d070-4686-ba00-f4a025096b7b\") " pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.817167 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kpc2\" (UniqueName: \"kubernetes.io/projected/c997c121-d070-4686-ba00-f4a025096b7b-kube-api-access-9kpc2\") pod \"root-account-create-update-nd6z4\" (UID: \"c997c121-d070-4686-ba00-f4a025096b7b\") " pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.817335 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c997c121-d070-4686-ba00-f4a025096b7b-operator-scripts\") pod \"root-account-create-update-nd6z4\" (UID: \"c997c121-d070-4686-ba00-f4a025096b7b\") " pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.818260 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c997c121-d070-4686-ba00-f4a025096b7b-operator-scripts\") pod \"root-account-create-update-nd6z4\" (UID: \"c997c121-d070-4686-ba00-f4a025096b7b\") " pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.836627 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kpc2\" (UniqueName: \"kubernetes.io/projected/c997c121-d070-4686-ba00-f4a025096b7b-kube-api-access-9kpc2\") pod \"root-account-create-update-nd6z4\" (UID: \"c997c121-d070-4686-ba00-f4a025096b7b\") " pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:46 crc kubenswrapper[4753]: I0129 15:20:46.908939 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:47 crc kubenswrapper[4753]: I0129 15:20:47.152810 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nd6z4"]
Jan 29 15:20:47 crc kubenswrapper[4753]: I0129 15:20:47.590199 4753 generic.go:334] "Generic (PLEG): container finished" podID="c997c121-d070-4686-ba00-f4a025096b7b" containerID="0bab75a23c795ae9cb14470ba54f6cc6c33394ce0ca512bb1a47e3e31d29f186" exitCode=0
Jan 29 15:20:47 crc kubenswrapper[4753]: I0129 15:20:47.590241 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nd6z4" event={"ID":"c997c121-d070-4686-ba00-f4a025096b7b","Type":"ContainerDied","Data":"0bab75a23c795ae9cb14470ba54f6cc6c33394ce0ca512bb1a47e3e31d29f186"}
Jan 29 15:20:47 crc kubenswrapper[4753]: I0129 15:20:47.590265 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nd6z4" event={"ID":"c997c121-d070-4686-ba00-f4a025096b7b","Type":"ContainerStarted","Data":"723a835c5ed47e72baf71c32782a6cd13fae5302d6ba4e259cd7540960a40017"}
Jan 29 15:20:48 crc kubenswrapper[4753]: I0129 15:20:48.600792 4753 generic.go:334] "Generic (PLEG): container finished" podID="52033640-0e60-4e49-a14c-fff49b4258ee" containerID="dfa2afb048901242468ccf990eac6de707c46ca94fba1edf1367dc3e2e3f2cc0" exitCode=0
Jan 29 15:20:48 crc kubenswrapper[4753]: I0129 15:20:48.600841 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52033640-0e60-4e49-a14c-fff49b4258ee","Type":"ContainerDied","Data":"dfa2afb048901242468ccf990eac6de707c46ca94fba1edf1367dc3e2e3f2cc0"}
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.042106 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.167733 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c997c121-d070-4686-ba00-f4a025096b7b-operator-scripts\") pod \"c997c121-d070-4686-ba00-f4a025096b7b\" (UID: \"c997c121-d070-4686-ba00-f4a025096b7b\") "
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.167827 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kpc2\" (UniqueName: \"kubernetes.io/projected/c997c121-d070-4686-ba00-f4a025096b7b-kube-api-access-9kpc2\") pod \"c997c121-d070-4686-ba00-f4a025096b7b\" (UID: \"c997c121-d070-4686-ba00-f4a025096b7b\") "
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.168145 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c997c121-d070-4686-ba00-f4a025096b7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c997c121-d070-4686-ba00-f4a025096b7b" (UID: "c997c121-d070-4686-ba00-f4a025096b7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.172351 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c997c121-d070-4686-ba00-f4a025096b7b-kube-api-access-9kpc2" (OuterVolumeSpecName: "kube-api-access-9kpc2") pod "c997c121-d070-4686-ba00-f4a025096b7b" (UID: "c997c121-d070-4686-ba00-f4a025096b7b"). InnerVolumeSpecName "kube-api-access-9kpc2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.269906 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c997c121-d070-4686-ba00-f4a025096b7b-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.269940 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kpc2\" (UniqueName: \"kubernetes.io/projected/c997c121-d070-4686-ba00-f4a025096b7b-kube-api-access-9kpc2\") on node \"crc\" DevicePath \"\""
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.608571 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nd6z4"
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.608577 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nd6z4" event={"ID":"c997c121-d070-4686-ba00-f4a025096b7b","Type":"ContainerDied","Data":"723a835c5ed47e72baf71c32782a6cd13fae5302d6ba4e259cd7540960a40017"}
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.608670 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723a835c5ed47e72baf71c32782a6cd13fae5302d6ba4e259cd7540960a40017"
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.610386 4753 generic.go:334] "Generic (PLEG): container finished" podID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" containerID="7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200" exitCode=0
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.610420 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42","Type":"ContainerDied","Data":"7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200"}
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.612347 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52033640-0e60-4e49-a14c-fff49b4258ee","Type":"ContainerStarted","Data":"16d5ae0313d32c9f7319bd651a136a7fa71050db5c4702a5874b4a67b338ef17"}
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.612767 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Jan 29 15:20:49 crc kubenswrapper[4753]: I0129 15:20:49.685233 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.685213241 podStartE2EDuration="36.685213241s" podCreationTimestamp="2026-01-29 15:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:20:49.679312932 +0000 UTC m=+4684.374047334" watchObservedRunningTime="2026-01-29 15:20:49.685213241 +0000 UTC m=+4684.379947623"
Jan 29 15:20:50 crc kubenswrapper[4753]: E0129 15:20:50.150714 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:20:50 crc kubenswrapper[4753]: I0129 15:20:50.622737 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42","Type":"ContainerStarted","Data":"3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8"}
Jan 29 15:20:50 crc kubenswrapper[4753]: I0129 15:20:50.623503 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:20:50 crc kubenswrapper[4753]: I0129 15:20:50.650791 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.650771307 podStartE2EDuration="37.650771307s" podCreationTimestamp="2026-01-29 15:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:20:50.640946732 +0000 UTC m=+4685.335681104" watchObservedRunningTime="2026-01-29 15:20:50.650771307 +0000 UTC m=+4685.345505709"
Jan 29 15:20:52 crc kubenswrapper[4753]: E0129 15:20:52.155534 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:21:02 crc kubenswrapper[4753]: E0129 15:21:02.281285 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Jan 29 15:21:02 crc kubenswrapper[4753]: E0129 15:21:02.282047 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qwgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9wxf5_openshift-marketplace(2dfdc10e-dc31-4565-b790-9b778061ba36): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:21:02 crc kubenswrapper[4753]: E0129 15:21:02.283234 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:21:04 crc kubenswrapper[4753]: I0129 15:21:04.700111 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Jan 29 15:21:05 crc kubenswrapper[4753]: I0129 15:21:05.043284 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:06 crc kubenswrapper[4753]: E0129 15:21:06.347590 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Jan 29 15:21:06 crc kubenswrapper[4753]: E0129 15:21:06.347864 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6bn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cxsh4_openshift-marketplace(d7388aa3-c682-48f9-b7d2-db1beee6393e): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:21:06 crc kubenswrapper[4753]: E0129 15:21:06.349096 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:21:08 crc kubenswrapper[4753]: I0129 15:21:08.877680 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699964fbc-vj7cn"]
Jan 29 15:21:08 crc kubenswrapper[4753]: E0129 15:21:08.878420 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c997c121-d070-4686-ba00-f4a025096b7b" containerName="mariadb-account-create-update"
Jan 29 15:21:08 crc kubenswrapper[4753]: I0129 15:21:08.878437 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c997c121-d070-4686-ba00-f4a025096b7b" containerName="mariadb-account-create-update"
Jan 29 15:21:08 crc kubenswrapper[4753]: I0129 15:21:08.878610 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c997c121-d070-4686-ba00-f4a025096b7b" containerName="mariadb-account-create-update"
Jan 29 15:21:08 crc kubenswrapper[4753]: I0129 15:21:08.879424 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:08 crc kubenswrapper[4753]: I0129 15:21:08.887087 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-vj7cn"]
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.027087 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-dns-svc\") pod \"dnsmasq-dns-699964fbc-vj7cn\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.027393 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j47zl\" (UniqueName: \"kubernetes.io/projected/28c0dd77-1fe3-4952-be0d-e9d0ef321273-kube-api-access-j47zl\") pod \"dnsmasq-dns-699964fbc-vj7cn\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.027420 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-config\") pod \"dnsmasq-dns-699964fbc-vj7cn\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.129239 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-dns-svc\") pod \"dnsmasq-dns-699964fbc-vj7cn\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.129284 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j47zl\" (UniqueName: \"kubernetes.io/projected/28c0dd77-1fe3-4952-be0d-e9d0ef321273-kube-api-access-j47zl\") pod \"dnsmasq-dns-699964fbc-vj7cn\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.129310 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-config\") pod \"dnsmasq-dns-699964fbc-vj7cn\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.130030 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-config\") pod \"dnsmasq-dns-699964fbc-vj7cn\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.130167 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-dns-svc\") pod \"dnsmasq-dns-699964fbc-vj7cn\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.152775 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j47zl\" (UniqueName: \"kubernetes.io/projected/28c0dd77-1fe3-4952-be0d-e9d0ef321273-kube-api-access-j47zl\") pod \"dnsmasq-dns-699964fbc-vj7cn\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.239928 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.591049 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 15:21:09 crc kubenswrapper[4753]: I0129 15:21:09.758050 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-vj7cn"]
Jan 29 15:21:10 crc kubenswrapper[4753]: I0129 15:21:10.133640 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 29 15:21:10 crc kubenswrapper[4753]: I0129 15:21:10.773372 4753 generic.go:334] "Generic (PLEG): container finished" podID="28c0dd77-1fe3-4952-be0d-e9d0ef321273" containerID="e085115e5cb3fe81227673565b520de9a5c875c69b73eef6583603ce1dbed345" exitCode=0
Jan 29 15:21:10 crc kubenswrapper[4753]: I0129 15:21:10.773416 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" event={"ID":"28c0dd77-1fe3-4952-be0d-e9d0ef321273","Type":"ContainerDied","Data":"e085115e5cb3fe81227673565b520de9a5c875c69b73eef6583603ce1dbed345"}
Jan 29 15:21:10 crc kubenswrapper[4753]: I0129 15:21:10.773441 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" event={"ID":"28c0dd77-1fe3-4952-be0d-e9d0ef321273","Type":"ContainerStarted","Data":"d2851affe45639b5bbe45db9e741eee82c7c22738e5bcc2f61428d3bcb4c6b0a"}
Jan 29 15:21:11 crc kubenswrapper[4753]: I0129 15:21:11.538315 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="52033640-0e60-4e49-a14c-fff49b4258ee" containerName="rabbitmq" containerID="cri-o://16d5ae0313d32c9f7319bd651a136a7fa71050db5c4702a5874b4a67b338ef17" gracePeriod=604799
Jan 29 15:21:11 crc kubenswrapper[4753]: I0129 15:21:11.780382 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" event={"ID":"28c0dd77-1fe3-4952-be0d-e9d0ef321273","Type":"ContainerStarted","Data":"2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1"}
Jan 29 15:21:11 crc kubenswrapper[4753]: I0129 15:21:11.780765 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-699964fbc-vj7cn"
Jan 29 15:21:12 crc kubenswrapper[4753]: I0129 15:21:12.046752 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" containerName="rabbitmq" containerID="cri-o://3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8" gracePeriod=604799
Jan 29 15:21:14 crc kubenswrapper[4753]: I0129 15:21:14.697346 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="52033640-0e60-4e49-a14c-fff49b4258ee" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.246:5672: connect: connection refused"
Jan 29 15:21:15 crc kubenswrapper[4753]: I0129 15:21:15.042890 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.247:5672: connect: connection refused"
Jan 29 15:21:16 crc kubenswrapper[4753]: E0129 15:21:16.163664 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:21:16 crc kubenswrapper[4753]: I0129 15:21:16.199030 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" podStartSLOduration=8.199002197 podStartE2EDuration="8.199002197s" podCreationTimestamp="2026-01-29 15:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:21:11.803179571 +0000 UTC m=+4706.497913963" watchObservedRunningTime="2026-01-29 15:21:16.199002197 +0000 UTC m=+4710.893736599"
Jan 29 15:21:17 crc kubenswrapper[4753]: I0129 15:21:17.825901 4753 generic.go:334] "Generic (PLEG): container finished" podID="52033640-0e60-4e49-a14c-fff49b4258ee" containerID="16d5ae0313d32c9f7319bd651a136a7fa71050db5c4702a5874b4a67b338ef17" exitCode=0
Jan 29 15:21:17 crc kubenswrapper[4753]: I0129 15:21:17.825944 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52033640-0e60-4e49-a14c-fff49b4258ee","Type":"ContainerDied","Data":"16d5ae0313d32c9f7319bd651a136a7fa71050db5c4702a5874b4a67b338ef17"}
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.111238 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 29 15:21:18 crc kubenswrapper[4753]: E0129 15:21:18.156422 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.282591 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-confd\") pod \"52033640-0e60-4e49-a14c-fff49b4258ee\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.282652 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-erlang-cookie\") pod \"52033640-0e60-4e49-a14c-fff49b4258ee\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.282737 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbx2w\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-kube-api-access-qbx2w\") pod \"52033640-0e60-4e49-a14c-fff49b4258ee\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.282937 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") pod \"52033640-0e60-4e49-a14c-fff49b4258ee\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.282975 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-server-conf\") pod \"52033640-0e60-4e49-a14c-fff49b4258ee\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.283021 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-plugins-conf\") pod \"52033640-0e60-4e49-a14c-fff49b4258ee\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.283069 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-plugins\") pod \"52033640-0e60-4e49-a14c-fff49b4258ee\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.283120 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/52033640-0e60-4e49-a14c-fff49b4258ee-pod-info\") pod \"52033640-0e60-4e49-a14c-fff49b4258ee\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.283199 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "52033640-0e60-4e49-a14c-fff49b4258ee" (UID: "52033640-0e60-4e49-a14c-fff49b4258ee"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.283234 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/52033640-0e60-4e49-a14c-fff49b4258ee-erlang-cookie-secret\") pod \"52033640-0e60-4e49-a14c-fff49b4258ee\" (UID: \"52033640-0e60-4e49-a14c-fff49b4258ee\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.283677 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.285178 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "52033640-0e60-4e49-a14c-fff49b4258ee" (UID: "52033640-0e60-4e49-a14c-fff49b4258ee"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.285538 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "52033640-0e60-4e49-a14c-fff49b4258ee" (UID: "52033640-0e60-4e49-a14c-fff49b4258ee"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.290397 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-kube-api-access-qbx2w" (OuterVolumeSpecName: "kube-api-access-qbx2w") pod "52033640-0e60-4e49-a14c-fff49b4258ee" (UID: "52033640-0e60-4e49-a14c-fff49b4258ee"). InnerVolumeSpecName "kube-api-access-qbx2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.290909 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/52033640-0e60-4e49-a14c-fff49b4258ee-pod-info" (OuterVolumeSpecName: "pod-info") pod "52033640-0e60-4e49-a14c-fff49b4258ee" (UID: "52033640-0e60-4e49-a14c-fff49b4258ee"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.290961 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52033640-0e60-4e49-a14c-fff49b4258ee-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "52033640-0e60-4e49-a14c-fff49b4258ee" (UID: "52033640-0e60-4e49-a14c-fff49b4258ee"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.304353 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe" (OuterVolumeSpecName: "persistence") pod "52033640-0e60-4e49-a14c-fff49b4258ee" (UID: "52033640-0e60-4e49-a14c-fff49b4258ee"). InnerVolumeSpecName "pvc-91fd710d-2230-46b4-b170-c4d4072b39fe". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.324854 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-server-conf" (OuterVolumeSpecName: "server-conf") pod "52033640-0e60-4e49-a14c-fff49b4258ee" (UID: "52033640-0e60-4e49-a14c-fff49b4258ee"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.384378 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "52033640-0e60-4e49-a14c-fff49b4258ee" (UID: "52033640-0e60-4e49-a14c-fff49b4258ee"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.385351 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.385381 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbx2w\" (UniqueName: \"kubernetes.io/projected/52033640-0e60-4e49-a14c-fff49b4258ee-kube-api-access-qbx2w\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.385423 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") on node \"crc\" "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.385440 4753 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-server-conf\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.385454 4753 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/52033640-0e60-4e49-a14c-fff49b4258ee-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.385466 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/52033640-0e60-4e49-a14c-fff49b4258ee-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.385477 4753 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/52033640-0e60-4e49-a14c-fff49b4258ee-pod-info\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.385488 4753 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/52033640-0e60-4e49-a14c-fff49b4258ee-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.403140 4753 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.403499 4753 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-91fd710d-2230-46b4-b170-c4d4072b39fe" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe") on node "crc"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.486792 4753 reconciler_common.go:293] "Volume detached for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.531679 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.689307 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-plugins-conf\") pod \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.689368 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-erlang-cookie\") pod \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.689389 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-erlang-cookie-secret\") pod \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.689438 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9pwz\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-kube-api-access-n9pwz\") pod \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.689524 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") pod \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.689559 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-plugins\") pod \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.689763 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" (UID: "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.690054 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" (UID: "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.690098 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-server-conf\") pod \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.690124 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-confd\") pod \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.690129 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" (UID: "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.690185 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-pod-info\") pod \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\" (UID: \"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42\") "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.690444 4753 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.690460 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.690471 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.694341 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-kube-api-access-n9pwz" (OuterVolumeSpecName: "kube-api-access-n9pwz") pod "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" (UID: "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42"). InnerVolumeSpecName "kube-api-access-n9pwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.694481 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" (UID: "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.695711 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-pod-info" (OuterVolumeSpecName: "pod-info") pod "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" (UID: "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.705197 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137" (OuterVolumeSpecName: "persistence") pod "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" (UID: "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42"). InnerVolumeSpecName "pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.711047 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-server-conf" (OuterVolumeSpecName: "server-conf") pod "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" (UID: "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.773719 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" (UID: "e5573e3e-7af4-4766-a43c-d9d0f2cf6f42"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.792329 4753 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.792363 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9pwz\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-kube-api-access-n9pwz\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.792404 4753 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") on node \"crc\" "
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.792420 4753 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-server-conf\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.792432 4753 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.792445 4753 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42-pod-info\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.812938 4753 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.813104 4753 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137") on node "crc"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.835108 4753 generic.go:334] "Generic (PLEG): container finished" podID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" containerID="3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8" exitCode=0
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.835234 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.836055 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42","Type":"ContainerDied","Data":"3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8"}
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.836103 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e5573e3e-7af4-4766-a43c-d9d0f2cf6f42","Type":"ContainerDied","Data":"c44a9967aa4a95151c886b6ce709ff1d747dfcb87cfb53f35b442536bd91c8a1"}
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.836121 4753 scope.go:117] "RemoveContainer" containerID="3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.839410 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"52033640-0e60-4e49-a14c-fff49b4258ee","Type":"ContainerDied","Data":"b26b62ec75ce866cd108fbf9fb2deb33f12538a189dfee272bd8c0706c3d889d"}
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.839497 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.854496 4753 scope.go:117] "RemoveContainer" containerID="7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.870205 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.890217 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.894033 4753 reconciler_common.go:293] "Volume detached for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") on node \"crc\" DevicePath \"\""
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.900066 4753 scope.go:117] "RemoveContainer" containerID="3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.901002 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 15:21:18 crc kubenswrapper[4753]: E0129 15:21:18.901089 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8\": container with ID starting with 3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8 not found: ID does not exist" containerID="3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.901140 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8"} err="failed to get container status \"3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8\": rpc error: code = NotFound desc = could not find container \"3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8\": container with ID starting with 3103a17da988262184b1bc77e9112eee8db90f1f6f717d9acbc79f7b341767e8 not found: ID does not exist"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.901190 4753 scope.go:117] "RemoveContainer" containerID="7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200"
Jan 29 15:21:18 crc kubenswrapper[4753]: E0129 15:21:18.904300 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200\": container with ID starting with 7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200 not found: ID does not exist" containerID="7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.904363 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200"} err="failed to get container status \"7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200\": rpc error: code = NotFound desc = could not find container \"7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200\": container with ID starting with 7c326ba8049859fdf499ff029488136f15517e2f825d793ec292be15d8815200 not found: ID does not exist"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.904412 4753 scope.go:117] "RemoveContainer" containerID="16d5ae0313d32c9f7319bd651a136a7fa71050db5c4702a5874b4a67b338ef17"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.911314 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.918685 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 29 15:21:18 crc kubenswrapper[4753]: E0129 15:21:18.919035 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52033640-0e60-4e49-a14c-fff49b4258ee" containerName="rabbitmq"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.919049 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="52033640-0e60-4e49-a14c-fff49b4258ee" containerName="rabbitmq"
Jan 29 15:21:18 crc kubenswrapper[4753]: E0129 15:21:18.919074 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52033640-0e60-4e49-a14c-fff49b4258ee" containerName="setup-container"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.919082 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="52033640-0e60-4e49-a14c-fff49b4258ee" containerName="setup-container"
Jan 29 15:21:18 crc kubenswrapper[4753]: E0129 15:21:18.919102 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" containerName="rabbitmq"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.919110 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" containerName="rabbitmq"
Jan 29 15:21:18 crc kubenswrapper[4753]: E0129 15:21:18.919129 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" containerName="setup-container"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.919136 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" containerName="setup-container"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.919334 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" containerName="rabbitmq"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.919351 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="52033640-0e60-4e49-a14c-fff49b4258ee" containerName="rabbitmq"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.920289 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.923572 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.923912 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.924084 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.924275 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.924444 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-j5d7j"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.928325 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.937505 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.938912 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.942758 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.942987 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-l7s95"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.943136 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.943678 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.945594 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.945704 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 29 15:21:18 crc kubenswrapper[4753]: I0129 15:21:18.956482 4753 scope.go:117] "RemoveContainer" containerID="dfa2afb048901242468ccf990eac6de707c46ca94fba1edf1367dc3e2e3f2cc0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103017 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95e968fb-1cc0-4aae-a72a-204c2515f449-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103068 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95e968fb-1cc0-4aae-a72a-204c2515f449-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103104 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lhgf\" (UniqueName: \"kubernetes.io/projected/135e4efe-2677-4433-9a33-e2d1e1220037-kube-api-access-6lhgf\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103123 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/135e4efe-2677-4433-9a33-e2d1e1220037-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103141 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/135e4efe-2677-4433-9a33-e2d1e1220037-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103170 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/135e4efe-2677-4433-9a33-e2d1e1220037-server-conf\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103189 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95e968fb-1cc0-4aae-a72a-204c2515f449-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103211 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r7gq\" (UniqueName: \"kubernetes.io/projected/95e968fb-1cc0-4aae-a72a-204c2515f449-kube-api-access-5r7gq\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103231 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/135e4efe-2677-4433-9a33-e2d1e1220037-pod-info\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103252 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103268 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95e968fb-1cc0-4aae-a72a-204c2515f449-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103288 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103313 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95e968fb-1cc0-4aae-a72a-204c2515f449-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103338 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95e968fb-1cc0-4aae-a72a-204c2515f449-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103366 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95e968fb-1cc0-4aae-a72a-204c2515f449-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103389 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/135e4efe-2677-4433-9a33-e2d1e1220037-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103411 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/135e4efe-2677-4433-9a33-e2d1e1220037-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.103426 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/135e4efe-2677-4433-9a33-e2d1e1220037-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.204510 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/135e4efe-2677-4433-9a33-e2d1e1220037-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.204566 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95e968fb-1cc0-4aae-a72a-204c2515f449-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.204591 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95e968fb-1cc0-4aae-a72a-204c2515f449-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.204622 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lhgf\" (UniqueName: \"kubernetes.io/projected/135e4efe-2677-4433-9a33-e2d1e1220037-kube-api-access-6lhgf\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.204642 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/135e4efe-2677-4433-9a33-e2d1e1220037-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.204684 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/135e4efe-2677-4433-9a33-e2d1e1220037-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.204699 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/135e4efe-2677-4433-9a33-e2d1e1220037-server-conf\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205266 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95e968fb-1cc0-4aae-a72a-204c2515f449-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.204718 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95e968fb-1cc0-4aae-a72a-204c2515f449-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r7gq\" (UniqueName: \"kubernetes.io/projected/95e968fb-1cc0-4aae-a72a-204c2515f449-kube-api-access-5r7gq\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205400 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/135e4efe-2677-4433-9a33-e2d1e1220037-pod-info\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0"
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129
15:21:19.205429 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205453 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95e968fb-1cc0-4aae-a72a-204c2515f449-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205484 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205521 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95e968fb-1cc0-4aae-a72a-204c2515f449-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205557 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95e968fb-1cc0-4aae-a72a-204c2515f449-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205585 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95e968fb-1cc0-4aae-a72a-204c2515f449-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205618 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/135e4efe-2677-4433-9a33-e2d1e1220037-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205647 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/135e4efe-2677-4433-9a33-e2d1e1220037-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.205813 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/135e4efe-2677-4433-9a33-e2d1e1220037-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.206056 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/135e4efe-2677-4433-9a33-e2d1e1220037-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.206179 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/135e4efe-2677-4433-9a33-e2d1e1220037-server-conf\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.206321 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95e968fb-1cc0-4aae-a72a-204c2515f449-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.206912 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/135e4efe-2677-4433-9a33-e2d1e1220037-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.207312 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95e968fb-1cc0-4aae-a72a-204c2515f449-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.207933 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95e968fb-1cc0-4aae-a72a-204c2515f449-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.208804 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.208861 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/71de4f8810de901ef5dd8c0afb3b0cb4e095c83e6fc96df3808fb90c3d031dcc/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.210303 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/135e4efe-2677-4433-9a33-e2d1e1220037-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.210390 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.210426 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c9350eec8e6d65f69d55b571e856890720df0c7e44ce5b03367f1258321e98e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.210645 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95e968fb-1cc0-4aae-a72a-204c2515f449-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.210791 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95e968fb-1cc0-4aae-a72a-204c2515f449-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.210968 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95e968fb-1cc0-4aae-a72a-204c2515f449-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.211139 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/135e4efe-2677-4433-9a33-e2d1e1220037-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.212128 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/135e4efe-2677-4433-9a33-e2d1e1220037-pod-info\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.228236 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lhgf\" (UniqueName: \"kubernetes.io/projected/135e4efe-2677-4433-9a33-e2d1e1220037-kube-api-access-6lhgf\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.236560 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r7gq\" (UniqueName: \"kubernetes.io/projected/95e968fb-1cc0-4aae-a72a-204c2515f449-kube-api-access-5r7gq\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.242376 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91fd710d-2230-46b4-b170-c4d4072b39fe\") pod \"rabbitmq-server-0\" (UID: \"135e4efe-2677-4433-9a33-e2d1e1220037\") " 
pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.242467 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.250894 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d5fda16a-7e0f-4aa0-b99b-c01eece99137\") pod \"rabbitmq-cell1-server-0\" (UID: \"95e968fb-1cc0-4aae-a72a-204c2515f449\") " pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.280588 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.306527 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.310086 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-9wm7w"] Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.310510 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" podUID="84d53137-9bfe-4bf0-9703-b7a02bfb45be" containerName="dnsmasq-dns" containerID="cri-o://c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19" gracePeriod=10 Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.694881 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.761232 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 29 15:21:19 crc kubenswrapper[4753]: W0129 15:21:19.770465 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e968fb_1cc0_4aae_a72a_204c2515f449.slice/crio-de635602cd965136fc75f10b431615c8882ef3dbbd7621b6629b8f317be7f791 WatchSource:0}: Error finding container de635602cd965136fc75f10b431615c8882ef3dbbd7621b6629b8f317be7f791: Status 404 returned error can't find the container with id de635602cd965136fc75f10b431615c8882ef3dbbd7621b6629b8f317be7f791 Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.805865 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.815951 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f9tp\" (UniqueName: \"kubernetes.io/projected/84d53137-9bfe-4bf0-9703-b7a02bfb45be-kube-api-access-4f9tp\") pod \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.816050 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-dns-svc\") pod \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\" (UID: \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.816079 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-config\") pod \"84d53137-9bfe-4bf0-9703-b7a02bfb45be\" (UID: 
\"84d53137-9bfe-4bf0-9703-b7a02bfb45be\") " Jan 29 15:21:19 crc kubenswrapper[4753]: W0129 15:21:19.816727 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod135e4efe_2677_4433_9a33_e2d1e1220037.slice/crio-4340032d62c9a2e9f05ef9771ed04be5ffe4e6d9ee0136742c0534875364937f WatchSource:0}: Error finding container 4340032d62c9a2e9f05ef9771ed04be5ffe4e6d9ee0136742c0534875364937f: Status 404 returned error can't find the container with id 4340032d62c9a2e9f05ef9771ed04be5ffe4e6d9ee0136742c0534875364937f Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.819682 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d53137-9bfe-4bf0-9703-b7a02bfb45be-kube-api-access-4f9tp" (OuterVolumeSpecName: "kube-api-access-4f9tp") pod "84d53137-9bfe-4bf0-9703-b7a02bfb45be" (UID: "84d53137-9bfe-4bf0-9703-b7a02bfb45be"). InnerVolumeSpecName "kube-api-access-4f9tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.847125 4753 generic.go:334] "Generic (PLEG): container finished" podID="84d53137-9bfe-4bf0-9703-b7a02bfb45be" containerID="c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19" exitCode=0 Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.847191 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" event={"ID":"84d53137-9bfe-4bf0-9703-b7a02bfb45be","Type":"ContainerDied","Data":"c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19"} Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.847214 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" event={"ID":"84d53137-9bfe-4bf0-9703-b7a02bfb45be","Type":"ContainerDied","Data":"a3c20d369e244e2adaacdcabf5803c3905d9eef7af2147702c4b4e33842151e2"} Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.847234 4753 scope.go:117] "RemoveContainer" containerID="c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.847345 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79f765b5-9wm7w" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.849337 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95e968fb-1cc0-4aae-a72a-204c2515f449","Type":"ContainerStarted","Data":"de635602cd965136fc75f10b431615c8882ef3dbbd7621b6629b8f317be7f791"} Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.851706 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84d53137-9bfe-4bf0-9703-b7a02bfb45be" (UID: "84d53137-9bfe-4bf0-9703-b7a02bfb45be"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.852006 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"135e4efe-2677-4433-9a33-e2d1e1220037","Type":"ContainerStarted","Data":"4340032d62c9a2e9f05ef9771ed04be5ffe4e6d9ee0136742c0534875364937f"} Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.852907 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-config" (OuterVolumeSpecName: "config") pod "84d53137-9bfe-4bf0-9703-b7a02bfb45be" (UID: "84d53137-9bfe-4bf0-9703-b7a02bfb45be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.909831 4753 scope.go:117] "RemoveContainer" containerID="1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.918244 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f9tp\" (UniqueName: \"kubernetes.io/projected/84d53137-9bfe-4bf0-9703-b7a02bfb45be-kube-api-access-4f9tp\") on node \"crc\" DevicePath \"\"" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.918277 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.918287 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d53137-9bfe-4bf0-9703-b7a02bfb45be-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.935347 4753 scope.go:117] "RemoveContainer" containerID="c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19" Jan 29 15:21:19 crc kubenswrapper[4753]: E0129 15:21:19.935750 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19\": container with ID starting with c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19 not found: ID does not exist" containerID="c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.935783 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19"} err="failed to get container status \"c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19\": rpc error: code = NotFound desc = could not find container \"c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19\": container with ID starting with c4283884c963036a6b4088c882bc450bafe1aa1674866e901f2846f105ffca19 not found: ID does not exist" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.935810 4753 scope.go:117] "RemoveContainer" containerID="1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693" Jan 29 15:21:19 crc kubenswrapper[4753]: E0129 15:21:19.936074 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693\": container with ID starting with 1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693 not found: ID does not exist" 
containerID="1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693" Jan 29 15:21:19 crc kubenswrapper[4753]: I0129 15:21:19.936098 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693"} err="failed to get container status \"1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693\": rpc error: code = NotFound desc = could not find container \"1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693\": container with ID starting with 1c4a7ef76a0ee0477b46c1ddf50e6c9789241c18e850b8c8bce464130bf82693 not found: ID does not exist" Jan 29 15:21:20 crc kubenswrapper[4753]: I0129 15:21:20.167736 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52033640-0e60-4e49-a14c-fff49b4258ee" path="/var/lib/kubelet/pods/52033640-0e60-4e49-a14c-fff49b4258ee/volumes" Jan 29 15:21:20 crc kubenswrapper[4753]: I0129 15:21:20.169335 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5573e3e-7af4-4766-a43c-d9d0f2cf6f42" path="/var/lib/kubelet/pods/e5573e3e-7af4-4766-a43c-d9d0f2cf6f42/volumes" Jan 29 15:21:20 crc kubenswrapper[4753]: I0129 15:21:20.214631 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-9wm7w"] Jan 29 15:21:20 crc kubenswrapper[4753]: I0129 15:21:20.226264 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d79f765b5-9wm7w"] Jan 29 15:21:21 crc kubenswrapper[4753]: I0129 15:21:21.882364 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95e968fb-1cc0-4aae-a72a-204c2515f449","Type":"ContainerStarted","Data":"720d5ad5431eef0e88a2a3797ce5675a5e9cf592c352b89bdb932ad9e58883cc"} Jan 29 15:21:21 crc kubenswrapper[4753]: I0129 15:21:21.886012 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"135e4efe-2677-4433-9a33-e2d1e1220037","Type":"ContainerStarted","Data":"46e8d359a42e0bfbb24ba6063e33d4fb21318805073c3abbd4ae027e9371cac2"} Jan 29 15:21:22 crc kubenswrapper[4753]: I0129 15:21:22.165453 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d53137-9bfe-4bf0-9703-b7a02bfb45be" path="/var/lib/kubelet/pods/84d53137-9bfe-4bf0-9703-b7a02bfb45be/volumes" Jan 29 15:21:28 crc kubenswrapper[4753]: E0129 15:21:28.153390 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:21:33 crc kubenswrapper[4753]: E0129 15:21:33.153561 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:21:42 crc kubenswrapper[4753]: E0129 15:21:42.153400 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:21:44 
crc kubenswrapper[4753]: E0129 15:21:44.154463 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:21:54 crc kubenswrapper[4753]: I0129 15:21:54.176911 4753 generic.go:334] "Generic (PLEG): container finished" podID="95e968fb-1cc0-4aae-a72a-204c2515f449" containerID="720d5ad5431eef0e88a2a3797ce5675a5e9cf592c352b89bdb932ad9e58883cc" exitCode=0 Jan 29 15:21:54 crc kubenswrapper[4753]: I0129 15:21:54.177024 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95e968fb-1cc0-4aae-a72a-204c2515f449","Type":"ContainerDied","Data":"720d5ad5431eef0e88a2a3797ce5675a5e9cf592c352b89bdb932ad9e58883cc"} Jan 29 15:21:55 crc kubenswrapper[4753]: I0129 15:21:55.187549 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"95e968fb-1cc0-4aae-a72a-204c2515f449","Type":"ContainerStarted","Data":"f8c83c3070629a50cfa5ddad26e08214313646f354f85ade7ef7bc49b7c11fa4"} Jan 29 15:21:55 crc kubenswrapper[4753]: I0129 15:21:55.188049 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:21:55 crc kubenswrapper[4753]: I0129 15:21:55.189144 4753 generic.go:334] "Generic (PLEG): container finished" podID="135e4efe-2677-4433-9a33-e2d1e1220037" containerID="46e8d359a42e0bfbb24ba6063e33d4fb21318805073c3abbd4ae027e9371cac2" exitCode=0 Jan 29 15:21:55 crc kubenswrapper[4753]: I0129 15:21:55.189199 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"135e4efe-2677-4433-9a33-e2d1e1220037","Type":"ContainerDied","Data":"46e8d359a42e0bfbb24ba6063e33d4fb21318805073c3abbd4ae027e9371cac2"} Jan 29 15:21:55 crc kubenswrapper[4753]: I0129 15:21:55.228091 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.228074106 podStartE2EDuration="37.228074106s" podCreationTimestamp="2026-01-29 15:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:21:55.221709214 +0000 UTC m=+4749.916443636" watchObservedRunningTime="2026-01-29 15:21:55.228074106 +0000 UTC m=+4749.922808498" Jan 29 15:21:56 crc kubenswrapper[4753]: I0129 15:21:56.198846 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"135e4efe-2677-4433-9a33-e2d1e1220037","Type":"ContainerStarted","Data":"4df5162b2ce0dd011814f2071959b63914b22c82e5b3488f43ae46eb30122d9d"} Jan 29 15:21:56 crc kubenswrapper[4753]: I0129 15:21:56.199586 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 29 15:21:56 crc kubenswrapper[4753]: I0129 15:21:56.222136 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.22211372 podStartE2EDuration="38.22211372s" podCreationTimestamp="2026-01-29 15:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:21:56.216302684 +0000 UTC m=+4750.911037086" watchObservedRunningTime="2026-01-29 15:21:56.22211372 +0000 UTC 
m=+4750.916848112" Jan 29 15:21:57 crc kubenswrapper[4753]: E0129 15:21:57.151163 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:21:59 crc kubenswrapper[4753]: E0129 15:21:59.151542 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:22:09 crc kubenswrapper[4753]: I0129 15:22:09.285384 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 29 15:22:09 crc kubenswrapper[4753]: I0129 15:22:09.312359 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 29 15:22:12 crc kubenswrapper[4753]: E0129 15:22:12.151950 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:22:12 crc kubenswrapper[4753]: E0129 15:22:12.151950 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.722002 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Jan 29 15:22:20 crc kubenswrapper[4753]: E0129 15:22:20.722800 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d53137-9bfe-4bf0-9703-b7a02bfb45be" containerName="init" Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.722813 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d53137-9bfe-4bf0-9703-b7a02bfb45be" containerName="init" Jan 29 15:22:20 crc kubenswrapper[4753]: E0129 15:22:20.722825 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d53137-9bfe-4bf0-9703-b7a02bfb45be" containerName="dnsmasq-dns" Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.722831 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d53137-9bfe-4bf0-9703-b7a02bfb45be" containerName="dnsmasq-dns" Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.722974 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d53137-9bfe-4bf0-9703-b7a02bfb45be" containerName="dnsmasq-dns" Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.723464 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.726773 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bphpf" Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.731657 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.751296 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwb6b\" (UniqueName: \"kubernetes.io/projected/f15fb700-8c7b-4455-bc5e-db230667e600-kube-api-access-dwb6b\") pod \"mariadb-client-1-default\" (UID: \"f15fb700-8c7b-4455-bc5e-db230667e600\") " pod="openstack/mariadb-client-1-default" Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.852558 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwb6b\" (UniqueName: \"kubernetes.io/projected/f15fb700-8c7b-4455-bc5e-db230667e600-kube-api-access-dwb6b\") pod \"mariadb-client-1-default\" (UID: \"f15fb700-8c7b-4455-bc5e-db230667e600\") " pod="openstack/mariadb-client-1-default" Jan 29 15:22:20 crc kubenswrapper[4753]: I0129 15:22:20.872486 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwb6b\" (UniqueName: \"kubernetes.io/projected/f15fb700-8c7b-4455-bc5e-db230667e600-kube-api-access-dwb6b\") pod \"mariadb-client-1-default\" (UID: \"f15fb700-8c7b-4455-bc5e-db230667e600\") " pod="openstack/mariadb-client-1-default" Jan 29 15:22:21 crc kubenswrapper[4753]: I0129 15:22:21.041621 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Jan 29 15:22:21 crc kubenswrapper[4753]: I0129 15:22:21.572614 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Jan 29 15:22:21 crc kubenswrapper[4753]: I0129 15:22:21.575624 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 15:22:22 crc kubenswrapper[4753]: I0129 15:22:22.413575 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"f15fb700-8c7b-4455-bc5e-db230667e600","Type":"ContainerStarted","Data":"64e30d70d39a46584a44fcf17656ee4674782394ddeafb9485e28e18b491d21e"} Jan 29 15:22:23 crc kubenswrapper[4753]: I0129 15:22:23.420789 4753 generic.go:334] "Generic (PLEG): container finished" podID="f15fb700-8c7b-4455-bc5e-db230667e600" containerID="8932a393025a0bf84d862bb6479e52ec424f9054fb2ecd3b38fea88f5daf29bf" exitCode=0 Jan 29 15:22:23 crc kubenswrapper[4753]: I0129 15:22:23.420871 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"f15fb700-8c7b-4455-bc5e-db230667e600","Type":"ContainerDied","Data":"8932a393025a0bf84d862bb6479e52ec424f9054fb2ecd3b38fea88f5daf29bf"} Jan 29 15:22:24 crc kubenswrapper[4753]: I0129 15:22:24.788306 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Jan 29 15:22:24 crc kubenswrapper[4753]: I0129 15:22:24.828926 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_f15fb700-8c7b-4455-bc5e-db230667e600/mariadb-client-1-default/0.log" Jan 29 15:22:24 crc kubenswrapper[4753]: I0129 15:22:24.857830 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Jan 29 15:22:24 crc kubenswrapper[4753]: I0129 15:22:24.863254 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Jan 29 15:22:24 crc kubenswrapper[4753]: I0129 15:22:24.927643 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwb6b\" (UniqueName: \"kubernetes.io/projected/f15fb700-8c7b-4455-bc5e-db230667e600-kube-api-access-dwb6b\") pod \"f15fb700-8c7b-4455-bc5e-db230667e600\" (UID: \"f15fb700-8c7b-4455-bc5e-db230667e600\") " Jan 29 15:22:24 crc kubenswrapper[4753]: I0129 15:22:24.932817 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15fb700-8c7b-4455-bc5e-db230667e600-kube-api-access-dwb6b" (OuterVolumeSpecName: "kube-api-access-dwb6b") pod "f15fb700-8c7b-4455-bc5e-db230667e600" (UID: "f15fb700-8c7b-4455-bc5e-db230667e600"). InnerVolumeSpecName "kube-api-access-dwb6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.029611 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwb6b\" (UniqueName: \"kubernetes.io/projected/f15fb700-8c7b-4455-bc5e-db230667e600-kube-api-access-dwb6b\") on node \"crc\" DevicePath \"\"" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.235250 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Jan 29 15:22:25 crc kubenswrapper[4753]: E0129 15:22:25.235856 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15fb700-8c7b-4455-bc5e-db230667e600" containerName="mariadb-client-1-default" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.235875 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15fb700-8c7b-4455-bc5e-db230667e600" containerName="mariadb-client-1-default" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.236050 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15fb700-8c7b-4455-bc5e-db230667e600" containerName="mariadb-client-1-default" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.236713 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.241410 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Jan 29 15:22:25 crc kubenswrapper[4753]: E0129 15:22:25.297196 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 29 15:22:25 crc kubenswrapper[4753]: E0129 15:22:25.297621 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qwgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9wxf5_openshift-marketplace(2dfdc10e-dc31-4565-b790-9b778061ba36): ErrImagePull: initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:22:25 crc kubenswrapper[4753]: E0129 15:22:25.298958 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/certified-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.440301 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvgz5\" (UniqueName: \"kubernetes.io/projected/4c84e331-41f6-4d78-9b2f-57c8a4d390f5-kube-api-access-nvgz5\") pod \"mariadb-client-2-default\" (UID: \"4c84e331-41f6-4d78-9b2f-57c8a4d390f5\") " pod="openstack/mariadb-client-2-default" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.450637 4753 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64e30d70d39a46584a44fcf17656ee4674782394ddeafb9485e28e18b491d21e" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.450737 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.541945 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvgz5\" (UniqueName: \"kubernetes.io/projected/4c84e331-41f6-4d78-9b2f-57c8a4d390f5-kube-api-access-nvgz5\") pod \"mariadb-client-2-default\" (UID: \"4c84e331-41f6-4d78-9b2f-57c8a4d390f5\") " pod="openstack/mariadb-client-2-default" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.563876 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvgz5\" (UniqueName: \"kubernetes.io/projected/4c84e331-41f6-4d78-9b2f-57c8a4d390f5-kube-api-access-nvgz5\") pod \"mariadb-client-2-default\" (UID: \"4c84e331-41f6-4d78-9b2f-57c8a4d390f5\") " pod="openstack/mariadb-client-2-default" Jan 29 15:22:25 crc kubenswrapper[4753]: I0129 15:22:25.857677 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Jan 29 15:22:26 crc kubenswrapper[4753]: E0129 15:22:26.161056 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:22:26 crc kubenswrapper[4753]: I0129 15:22:26.161528 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15fb700-8c7b-4455-bc5e-db230667e600" path="/var/lib/kubelet/pods/f15fb700-8c7b-4455-bc5e-db230667e600/volumes" Jan 29 15:22:26 crc kubenswrapper[4753]: I0129 15:22:26.185250 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Jan 29 15:22:26 crc kubenswrapper[4753]: W0129 15:22:26.200887 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c84e331_41f6_4d78_9b2f_57c8a4d390f5.slice/crio-f218967e2af7d8100186be1fd5799faa163e5a216578272616179fbd0bbbd40e WatchSource:0}: Error finding container f218967e2af7d8100186be1fd5799faa163e5a216578272616179fbd0bbbd40e: Status 404 returned error can't find the container with id f218967e2af7d8100186be1fd5799faa163e5a216578272616179fbd0bbbd40e Jan 29 15:22:26 crc kubenswrapper[4753]: I0129 15:22:26.464424 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"4c84e331-41f6-4d78-9b2f-57c8a4d390f5","Type":"ContainerStarted","Data":"bc1342b24fcbcd3ddf9a9d8cb4668c82367e9d7c93fe2055cd3afc60445c1f60"} Jan 29 15:22:26 crc kubenswrapper[4753]: I0129 15:22:26.464470 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"4c84e331-41f6-4d78-9b2f-57c8a4d390f5","Type":"ContainerStarted","Data":"f218967e2af7d8100186be1fd5799faa163e5a216578272616179fbd0bbbd40e"} Jan 29 15:22:27 crc kubenswrapper[4753]: I0129 15:22:27.476906 4753 generic.go:334] "Generic (PLEG): container finished" podID="4c84e331-41f6-4d78-9b2f-57c8a4d390f5" containerID="bc1342b24fcbcd3ddf9a9d8cb4668c82367e9d7c93fe2055cd3afc60445c1f60" exitCode=1 Jan 29 15:22:27 crc kubenswrapper[4753]: 
I0129 15:22:27.477405 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"4c84e331-41f6-4d78-9b2f-57c8a4d390f5","Type":"ContainerDied","Data":"bc1342b24fcbcd3ddf9a9d8cb4668c82367e9d7c93fe2055cd3afc60445c1f60"} Jan 29 15:22:28 crc kubenswrapper[4753]: I0129 15:22:28.851015 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Jan 29 15:22:28 crc kubenswrapper[4753]: I0129 15:22:28.896317 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Jan 29 15:22:28 crc kubenswrapper[4753]: I0129 15:22:28.902706 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Jan 29 15:22:28 crc kubenswrapper[4753]: I0129 15:22:28.997442 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvgz5\" (UniqueName: \"kubernetes.io/projected/4c84e331-41f6-4d78-9b2f-57c8a4d390f5-kube-api-access-nvgz5\") pod \"4c84e331-41f6-4d78-9b2f-57c8a4d390f5\" (UID: \"4c84e331-41f6-4d78-9b2f-57c8a4d390f5\") " Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.004787 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c84e331-41f6-4d78-9b2f-57c8a4d390f5-kube-api-access-nvgz5" (OuterVolumeSpecName: "kube-api-access-nvgz5") pod "4c84e331-41f6-4d78-9b2f-57c8a4d390f5" (UID: "4c84e331-41f6-4d78-9b2f-57c8a4d390f5"). InnerVolumeSpecName "kube-api-access-nvgz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.099901 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvgz5\" (UniqueName: \"kubernetes.io/projected/4c84e331-41f6-4d78-9b2f-57c8a4d390f5-kube-api-access-nvgz5\") on node \"crc\" DevicePath \"\"" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.262502 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Jan 29 15:22:29 crc kubenswrapper[4753]: E0129 15:22:29.264899 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c84e331-41f6-4d78-9b2f-57c8a4d390f5" containerName="mariadb-client-2-default" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.264947 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c84e331-41f6-4d78-9b2f-57c8a4d390f5" containerName="mariadb-client-2-default" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.265860 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c84e331-41f6-4d78-9b2f-57c8a4d390f5" containerName="mariadb-client-2-default" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.267806 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.295264 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.407520 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swfhj\" (UniqueName: \"kubernetes.io/projected/3c816a7c-aa79-4663-882e-8fc0021c691f-kube-api-access-swfhj\") pod \"mariadb-client-1\" (UID: \"3c816a7c-aa79-4663-882e-8fc0021c691f\") " pod="openstack/mariadb-client-1" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.495771 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f218967e2af7d8100186be1fd5799faa163e5a216578272616179fbd0bbbd40e" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.495882 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.512214 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swfhj\" (UniqueName: \"kubernetes.io/projected/3c816a7c-aa79-4663-882e-8fc0021c691f-kube-api-access-swfhj\") pod \"mariadb-client-1\" (UID: \"3c816a7c-aa79-4663-882e-8fc0021c691f\") " pod="openstack/mariadb-client-1" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.531782 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swfhj\" (UniqueName: \"kubernetes.io/projected/3c816a7c-aa79-4663-882e-8fc0021c691f-kube-api-access-swfhj\") pod \"mariadb-client-1\" (UID: \"3c816a7c-aa79-4663-882e-8fc0021c691f\") " pod="openstack/mariadb-client-1" Jan 29 15:22:29 crc kubenswrapper[4753]: I0129 15:22:29.605267 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Jan 29 15:22:30 crc kubenswrapper[4753]: I0129 15:22:30.164622 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c84e331-41f6-4d78-9b2f-57c8a4d390f5" path="/var/lib/kubelet/pods/4c84e331-41f6-4d78-9b2f-57c8a4d390f5/volumes" Jan 29 15:22:30 crc kubenswrapper[4753]: I0129 15:22:30.166032 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Jan 29 15:22:30 crc kubenswrapper[4753]: I0129 15:22:30.505723 4753 generic.go:334] "Generic (PLEG): container finished" podID="3c816a7c-aa79-4663-882e-8fc0021c691f" containerID="8268b0c22d80520c4113a789bca7dc6f080b9307ebad7ddd883bcac947c0e500" exitCode=0 Jan 29 15:22:30 crc kubenswrapper[4753]: I0129 15:22:30.505939 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"3c816a7c-aa79-4663-882e-8fc0021c691f","Type":"ContainerDied","Data":"8268b0c22d80520c4113a789bca7dc6f080b9307ebad7ddd883bcac947c0e500"} Jan 29 15:22:30 crc kubenswrapper[4753]: I0129 15:22:30.506014 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"3c816a7c-aa79-4663-882e-8fc0021c691f","Type":"ContainerStarted","Data":"fa9c52357d73752efb244ccefb6fca69faadfbd460bdb533f7c6e8fdb1d5a9bf"} Jan 29 15:22:31 crc kubenswrapper[4753]: I0129 15:22:31.918379 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Jan 29 15:22:31 crc kubenswrapper[4753]: I0129 15:22:31.940744 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_3c816a7c-aa79-4663-882e-8fc0021c691f/mariadb-client-1/0.log" Jan 29 15:22:31 crc kubenswrapper[4753]: I0129 15:22:31.969394 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Jan 29 15:22:31 crc kubenswrapper[4753]: I0129 15:22:31.974840 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.056205 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swfhj\" (UniqueName: \"kubernetes.io/projected/3c816a7c-aa79-4663-882e-8fc0021c691f-kube-api-access-swfhj\") pod \"3c816a7c-aa79-4663-882e-8fc0021c691f\" (UID: \"3c816a7c-aa79-4663-882e-8fc0021c691f\") " Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.071412 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c816a7c-aa79-4663-882e-8fc0021c691f-kube-api-access-swfhj" (OuterVolumeSpecName: "kube-api-access-swfhj") pod "3c816a7c-aa79-4663-882e-8fc0021c691f" (UID: "3c816a7c-aa79-4663-882e-8fc0021c691f"). InnerVolumeSpecName "kube-api-access-swfhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.158047 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swfhj\" (UniqueName: \"kubernetes.io/projected/3c816a7c-aa79-4663-882e-8fc0021c691f-kube-api-access-swfhj\") on node \"crc\" DevicePath \"\"" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.160942 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c816a7c-aa79-4663-882e-8fc0021c691f" path="/var/lib/kubelet/pods/3c816a7c-aa79-4663-882e-8fc0021c691f/volumes" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.388056 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Jan 29 15:22:32 crc kubenswrapper[4753]: E0129 15:22:32.388643 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c816a7c-aa79-4663-882e-8fc0021c691f" containerName="mariadb-client-1" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.388680 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c816a7c-aa79-4663-882e-8fc0021c691f" containerName="mariadb-client-1" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.388973 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c816a7c-aa79-4663-882e-8fc0021c691f" containerName="mariadb-client-1" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.389958 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.397759 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.465294 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgbxx\" (UniqueName: \"kubernetes.io/projected/79f8aac7-b57b-4ed1-87b4-895652f1fa64-kube-api-access-fgbxx\") pod \"mariadb-client-4-default\" (UID: \"79f8aac7-b57b-4ed1-87b4-895652f1fa64\") " pod="openstack/mariadb-client-4-default" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.524611 4753 scope.go:117] "RemoveContainer" containerID="8268b0c22d80520c4113a789bca7dc6f080b9307ebad7ddd883bcac947c0e500" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.524643 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.566446 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgbxx\" (UniqueName: \"kubernetes.io/projected/79f8aac7-b57b-4ed1-87b4-895652f1fa64-kube-api-access-fgbxx\") pod \"mariadb-client-4-default\" (UID: \"79f8aac7-b57b-4ed1-87b4-895652f1fa64\") " pod="openstack/mariadb-client-4-default" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.589232 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgbxx\" (UniqueName: \"kubernetes.io/projected/79f8aac7-b57b-4ed1-87b4-895652f1fa64-kube-api-access-fgbxx\") pod \"mariadb-client-4-default\" (UID: \"79f8aac7-b57b-4ed1-87b4-895652f1fa64\") " pod="openstack/mariadb-client-4-default" Jan 29 15:22:32 crc kubenswrapper[4753]: I0129 15:22:32.718108 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Jan 29 15:22:33 crc kubenswrapper[4753]: I0129 15:22:33.273814 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Jan 29 15:22:33 crc kubenswrapper[4753]: W0129 15:22:33.602697 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f8aac7_b57b_4ed1_87b4_895652f1fa64.slice/crio-4bc15993cbdf2e33cdec18d9c2b5a4635ec3e4989422f456fa5d37667e72b0c5 WatchSource:0}: Error finding container 4bc15993cbdf2e33cdec18d9c2b5a4635ec3e4989422f456fa5d37667e72b0c5: Status 404 returned error can't find the container with id 4bc15993cbdf2e33cdec18d9c2b5a4635ec3e4989422f456fa5d37667e72b0c5 Jan 29 15:22:34 crc kubenswrapper[4753]: I0129 15:22:34.581841 4753 generic.go:334] "Generic (PLEG): container finished" podID="79f8aac7-b57b-4ed1-87b4-895652f1fa64" containerID="c3759be22ad99b77829af8ec0f086b781a03c78174c706a86b97d6e9c3d02fcb" exitCode=0 Jan 29 15:22:34 crc kubenswrapper[4753]: I0129 15:22:34.581914 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"79f8aac7-b57b-4ed1-87b4-895652f1fa64","Type":"ContainerDied","Data":"c3759be22ad99b77829af8ec0f086b781a03c78174c706a86b97d6e9c3d02fcb"} Jan 29 15:22:34 crc kubenswrapper[4753]: I0129 15:22:34.582187 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"79f8aac7-b57b-4ed1-87b4-895652f1fa64","Type":"ContainerStarted","Data":"4bc15993cbdf2e33cdec18d9c2b5a4635ec3e4989422f456fa5d37667e72b0c5"} Jan 29 15:22:35 crc kubenswrapper[4753]: I0129 15:22:35.975568 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Jan 29 15:22:35 crc kubenswrapper[4753]: I0129 15:22:35.993452 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_79f8aac7-b57b-4ed1-87b4-895652f1fa64/mariadb-client-4-default/0.log" Jan 29 15:22:36 crc kubenswrapper[4753]: I0129 15:22:36.019609 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Jan 29 15:22:36 crc kubenswrapper[4753]: I0129 15:22:36.025103 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Jan 29 15:22:36 crc kubenswrapper[4753]: I0129 15:22:36.116970 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgbxx\" (UniqueName: \"kubernetes.io/projected/79f8aac7-b57b-4ed1-87b4-895652f1fa64-kube-api-access-fgbxx\") pod \"79f8aac7-b57b-4ed1-87b4-895652f1fa64\" (UID: \"79f8aac7-b57b-4ed1-87b4-895652f1fa64\") " Jan 29 15:22:36 crc kubenswrapper[4753]: I0129 15:22:36.125310 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f8aac7-b57b-4ed1-87b4-895652f1fa64-kube-api-access-fgbxx" (OuterVolumeSpecName: "kube-api-access-fgbxx") pod "79f8aac7-b57b-4ed1-87b4-895652f1fa64" (UID: "79f8aac7-b57b-4ed1-87b4-895652f1fa64"). InnerVolumeSpecName "kube-api-access-fgbxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:22:36 crc kubenswrapper[4753]: I0129 15:22:36.163413 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f8aac7-b57b-4ed1-87b4-895652f1fa64" path="/var/lib/kubelet/pods/79f8aac7-b57b-4ed1-87b4-895652f1fa64/volumes" Jan 29 15:22:36 crc kubenswrapper[4753]: I0129 15:22:36.219103 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgbxx\" (UniqueName: \"kubernetes.io/projected/79f8aac7-b57b-4ed1-87b4-895652f1fa64-kube-api-access-fgbxx\") on node \"crc\" DevicePath \"\"" Jan 29 15:22:36 crc kubenswrapper[4753]: I0129 15:22:36.596573 4753 scope.go:117] "RemoveContainer" containerID="c3759be22ad99b77829af8ec0f086b781a03c78174c706a86b97d6e9c3d02fcb" Jan 29 15:22:36 crc kubenswrapper[4753]: I0129 15:22:36.596610 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Jan 29 15:22:38 crc kubenswrapper[4753]: E0129 15:22:38.316756 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 29 15:22:38 crc kubenswrapper[4753]: E0129 15:22:38.317243 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6bn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cxsh4_openshift-marketplace(d7388aa3-c682-48f9-b7d2-db1beee6393e): ErrImagePull: initializing source docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:22:38 crc kubenswrapper[4753]: E0129 15:22:38.320731 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source 
docker://registry.redhat.io/redhat/redhat-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:22:40 crc kubenswrapper[4753]: E0129 15:22:40.153198 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.526507 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Jan 29 15:22:40 crc kubenswrapper[4753]: E0129 15:22:40.526919 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f8aac7-b57b-4ed1-87b4-895652f1fa64" containerName="mariadb-client-4-default" Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.526932 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f8aac7-b57b-4ed1-87b4-895652f1fa64" containerName="mariadb-client-4-default" Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.527072 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f8aac7-b57b-4ed1-87b4-895652f1fa64" containerName="mariadb-client-4-default" Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.527617 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.530234 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bphpf" Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.537615 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.691788 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqq2j\" (UniqueName: \"kubernetes.io/projected/6e10b0de-b2d3-4cb3-ab29-658b251b93b8-kube-api-access-kqq2j\") pod \"mariadb-client-5-default\" (UID: \"6e10b0de-b2d3-4cb3-ab29-658b251b93b8\") " pod="openstack/mariadb-client-5-default" Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.792783 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqq2j\" (UniqueName: \"kubernetes.io/projected/6e10b0de-b2d3-4cb3-ab29-658b251b93b8-kube-api-access-kqq2j\") pod \"mariadb-client-5-default\" (UID: \"6e10b0de-b2d3-4cb3-ab29-658b251b93b8\") " pod="openstack/mariadb-client-5-default" Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.812361 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqq2j\" (UniqueName: \"kubernetes.io/projected/6e10b0de-b2d3-4cb3-ab29-658b251b93b8-kube-api-access-kqq2j\") pod \"mariadb-client-5-default\" (UID: \"6e10b0de-b2d3-4cb3-ab29-658b251b93b8\") " pod="openstack/mariadb-client-5-default" Jan 29 15:22:40 crc kubenswrapper[4753]: I0129 15:22:40.854803 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Jan 29 15:22:41 crc kubenswrapper[4753]: I0129 15:22:41.378714 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Jan 29 15:22:41 crc kubenswrapper[4753]: I0129 15:22:41.639399 4753 generic.go:334] "Generic (PLEG): container finished" podID="6e10b0de-b2d3-4cb3-ab29-658b251b93b8" containerID="41751e677c8457ccc4f8d9eb15ddc6bc84ca4ee1b36b7ad3a69b7193d6875bc6" exitCode=0 Jan 29 15:22:41 crc kubenswrapper[4753]: I0129 15:22:41.639446 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"6e10b0de-b2d3-4cb3-ab29-658b251b93b8","Type":"ContainerDied","Data":"41751e677c8457ccc4f8d9eb15ddc6bc84ca4ee1b36b7ad3a69b7193d6875bc6"} Jan 29 15:22:41 crc kubenswrapper[4753]: I0129 15:22:41.639471 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"6e10b0de-b2d3-4cb3-ab29-658b251b93b8","Type":"ContainerStarted","Data":"9c7316fe6c37ceba17519032201022820b94dcab388c27c999ea31c855793448"} Jan 29 15:22:42 crc kubenswrapper[4753]: I0129 15:22:42.997180 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.019630 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_6e10b0de-b2d3-4cb3-ab29-658b251b93b8/mariadb-client-5-default/0.log" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.052069 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.057892 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.131750 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqq2j\" (UniqueName: \"kubernetes.io/projected/6e10b0de-b2d3-4cb3-ab29-658b251b93b8-kube-api-access-kqq2j\") pod \"6e10b0de-b2d3-4cb3-ab29-658b251b93b8\" (UID: \"6e10b0de-b2d3-4cb3-ab29-658b251b93b8\") " Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.137294 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e10b0de-b2d3-4cb3-ab29-658b251b93b8-kube-api-access-kqq2j" (OuterVolumeSpecName: "kube-api-access-kqq2j") pod "6e10b0de-b2d3-4cb3-ab29-658b251b93b8" (UID: "6e10b0de-b2d3-4cb3-ab29-658b251b93b8"). InnerVolumeSpecName "kube-api-access-kqq2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.204213 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Jan 29 15:22:43 crc kubenswrapper[4753]: E0129 15:22:43.204629 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e10b0de-b2d3-4cb3-ab29-658b251b93b8" containerName="mariadb-client-5-default" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.204652 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e10b0de-b2d3-4cb3-ab29-658b251b93b8" containerName="mariadb-client-5-default" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.204845 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e10b0de-b2d3-4cb3-ab29-658b251b93b8" containerName="mariadb-client-5-default" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.205492 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.214058 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.234284 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qws4w\" (UniqueName: \"kubernetes.io/projected/60db294b-5b80-4d1f-afdb-5fba14e7b954-kube-api-access-qws4w\") pod \"mariadb-client-6-default\" (UID: \"60db294b-5b80-4d1f-afdb-5fba14e7b954\") " pod="openstack/mariadb-client-6-default" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.234436 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqq2j\" (UniqueName: \"kubernetes.io/projected/6e10b0de-b2d3-4cb3-ab29-658b251b93b8-kube-api-access-kqq2j\") on node \"crc\" DevicePath \"\"" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.335754 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qws4w\" (UniqueName: \"kubernetes.io/projected/60db294b-5b80-4d1f-afdb-5fba14e7b954-kube-api-access-qws4w\") pod \"mariadb-client-6-default\" (UID: \"60db294b-5b80-4d1f-afdb-5fba14e7b954\") " pod="openstack/mariadb-client-6-default" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.358600 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qws4w\" (UniqueName: \"kubernetes.io/projected/60db294b-5b80-4d1f-afdb-5fba14e7b954-kube-api-access-qws4w\") pod \"mariadb-client-6-default\" (UID: \"60db294b-5b80-4d1f-afdb-5fba14e7b954\") " pod="openstack/mariadb-client-6-default" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.526384 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.656094 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c7316fe6c37ceba17519032201022820b94dcab388c27c999ea31c855793448" Jan 29 15:22:43 crc kubenswrapper[4753]: I0129 15:22:43.656243 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Jan 29 15:22:44 crc kubenswrapper[4753]: I0129 15:22:44.004628 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Jan 29 15:22:44 crc kubenswrapper[4753]: W0129 15:22:44.008765 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60db294b_5b80_4d1f_afdb_5fba14e7b954.slice/crio-8ba9cb93ed7e6a320171d08f364a6fc43a792b663b2130c5a68e6bf356aeee45 WatchSource:0}: Error finding container 8ba9cb93ed7e6a320171d08f364a6fc43a792b663b2130c5a68e6bf356aeee45: Status 404 returned error can't find the container with id 8ba9cb93ed7e6a320171d08f364a6fc43a792b663b2130c5a68e6bf356aeee45 Jan 29 15:22:44 crc kubenswrapper[4753]: I0129 15:22:44.161377 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e10b0de-b2d3-4cb3-ab29-658b251b93b8" path="/var/lib/kubelet/pods/6e10b0de-b2d3-4cb3-ab29-658b251b93b8/volumes" Jan 29 15:22:44 crc kubenswrapper[4753]: I0129 15:22:44.670126 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"60db294b-5b80-4d1f-afdb-5fba14e7b954","Type":"ContainerStarted","Data":"3f8a81ffdac224b8f2798d195f769e4326e7c6d0f75e0eb9a4ed6853d513c6ef"} Jan 29 15:22:44 crc kubenswrapper[4753]: I0129 15:22:44.670239 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"60db294b-5b80-4d1f-afdb-5fba14e7b954","Type":"ContainerStarted","Data":"8ba9cb93ed7e6a320171d08f364a6fc43a792b663b2130c5a68e6bf356aeee45"} Jan 29 15:22:44 crc kubenswrapper[4753]: I0129 15:22:44.693211 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.693191196 podStartE2EDuration="1.693191196s" podCreationTimestamp="2026-01-29 15:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:22:44.685098438 +0000 UTC m=+4799.379832830" watchObservedRunningTime="2026-01-29 15:22:44.693191196 +0000 UTC m=+4799.387925608" Jan 29 15:22:44 crc kubenswrapper[4753]: E0129 15:22:44.998797 4753 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.142:33672->38.102.83.142:37393: read tcp 38.102.83.142:33672->38.102.83.142:37393: read: connection reset by peer Jan 29 15:22:45 crc kubenswrapper[4753]: I0129 15:22:45.681790 4753 generic.go:334] "Generic (PLEG): container finished" podID="60db294b-5b80-4d1f-afdb-5fba14e7b954" containerID="3f8a81ffdac224b8f2798d195f769e4326e7c6d0f75e0eb9a4ed6853d513c6ef" exitCode=1 Jan 29 15:22:45 crc kubenswrapper[4753]: I0129 15:22:45.682016 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"60db294b-5b80-4d1f-afdb-5fba14e7b954","Type":"ContainerDied","Data":"3f8a81ffdac224b8f2798d195f769e4326e7c6d0f75e0eb9a4ed6853d513c6ef"} Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.200483 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.244777 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.254130 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.302068 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qws4w\" (UniqueName: \"kubernetes.io/projected/60db294b-5b80-4d1f-afdb-5fba14e7b954-kube-api-access-qws4w\") pod \"60db294b-5b80-4d1f-afdb-5fba14e7b954\" (UID: \"60db294b-5b80-4d1f-afdb-5fba14e7b954\") " Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.309729 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60db294b-5b80-4d1f-afdb-5fba14e7b954-kube-api-access-qws4w" (OuterVolumeSpecName: "kube-api-access-qws4w") pod "60db294b-5b80-4d1f-afdb-5fba14e7b954" (UID: "60db294b-5b80-4d1f-afdb-5fba14e7b954"). InnerVolumeSpecName "kube-api-access-qws4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.370355 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Jan 29 15:22:47 crc kubenswrapper[4753]: E0129 15:22:47.371343 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60db294b-5b80-4d1f-afdb-5fba14e7b954" containerName="mariadb-client-6-default" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.371369 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="60db294b-5b80-4d1f-afdb-5fba14e7b954" containerName="mariadb-client-6-default" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.371582 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="60db294b-5b80-4d1f-afdb-5fba14e7b954" containerName="mariadb-client-6-default" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.373880 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.378472 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.403976 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm8g8\" (UniqueName: \"kubernetes.io/projected/36cf39c7-e710-4a55-876a-b62cf79ee934-kube-api-access-xm8g8\") pod \"mariadb-client-7-default\" (UID: \"36cf39c7-e710-4a55-876a-b62cf79ee934\") " pod="openstack/mariadb-client-7-default" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.404292 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qws4w\" (UniqueName: \"kubernetes.io/projected/60db294b-5b80-4d1f-afdb-5fba14e7b954-kube-api-access-qws4w\") on node \"crc\" DevicePath \"\"" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.505430 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm8g8\" (UniqueName: \"kubernetes.io/projected/36cf39c7-e710-4a55-876a-b62cf79ee934-kube-api-access-xm8g8\") pod \"mariadb-client-7-default\" (UID: \"36cf39c7-e710-4a55-876a-b62cf79ee934\") " pod="openstack/mariadb-client-7-default" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.522373 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm8g8\" (UniqueName: \"kubernetes.io/projected/36cf39c7-e710-4a55-876a-b62cf79ee934-kube-api-access-xm8g8\") pod \"mariadb-client-7-default\" (UID: \"36cf39c7-e710-4a55-876a-b62cf79ee934\") " pod="openstack/mariadb-client-7-default" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.698578 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.707722 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ba9cb93ed7e6a320171d08f364a6fc43a792b663b2130c5a68e6bf356aeee45" Jan 29 15:22:47 crc kubenswrapper[4753]: I0129 15:22:47.707813 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Jan 29 15:22:48 crc kubenswrapper[4753]: I0129 15:22:48.168627 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60db294b-5b80-4d1f-afdb-5fba14e7b954" path="/var/lib/kubelet/pods/60db294b-5b80-4d1f-afdb-5fba14e7b954/volumes" Jan 29 15:22:48 crc kubenswrapper[4753]: I0129 15:22:48.205463 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Jan 29 15:22:48 crc kubenswrapper[4753]: W0129 15:22:48.210847 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36cf39c7_e710_4a55_876a_b62cf79ee934.slice/crio-a9f0d9b7b0e7af5cda9844e04078783225d6de55e076b458c45874479ed769e2 WatchSource:0}: Error finding container a9f0d9b7b0e7af5cda9844e04078783225d6de55e076b458c45874479ed769e2: Status 404 returned error can't find the container with id a9f0d9b7b0e7af5cda9844e04078783225d6de55e076b458c45874479ed769e2 Jan 29 15:22:48 crc kubenswrapper[4753]: I0129 15:22:48.718852 4753 generic.go:334] "Generic (PLEG): container finished" podID="36cf39c7-e710-4a55-876a-b62cf79ee934" containerID="f0f9d1da7e58e7226e1f6a9c4c95babfffb713f93552d7f86411075b063cdfb3" exitCode=0 Jan 29 15:22:48 crc kubenswrapper[4753]: I0129 15:22:48.718951 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"36cf39c7-e710-4a55-876a-b62cf79ee934","Type":"ContainerDied","Data":"f0f9d1da7e58e7226e1f6a9c4c95babfffb713f93552d7f86411075b063cdfb3"} Jan 29 15:22:48 crc kubenswrapper[4753]: I0129 15:22:48.720794 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"36cf39c7-e710-4a55-876a-b62cf79ee934","Type":"ContainerStarted","Data":"a9f0d9b7b0e7af5cda9844e04078783225d6de55e076b458c45874479ed769e2"} Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.103731 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.131969 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_36cf39c7-e710-4a55-876a-b62cf79ee934/mariadb-client-7-default/0.log" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.192625 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.200697 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.262580 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm8g8\" (UniqueName: \"kubernetes.io/projected/36cf39c7-e710-4a55-876a-b62cf79ee934-kube-api-access-xm8g8\") pod \"36cf39c7-e710-4a55-876a-b62cf79ee934\" (UID: \"36cf39c7-e710-4a55-876a-b62cf79ee934\") " Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.269990 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36cf39c7-e710-4a55-876a-b62cf79ee934-kube-api-access-xm8g8" (OuterVolumeSpecName: "kube-api-access-xm8g8") pod "36cf39c7-e710-4a55-876a-b62cf79ee934" (UID: "36cf39c7-e710-4a55-876a-b62cf79ee934"). InnerVolumeSpecName "kube-api-access-xm8g8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.319654 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Jan 29 15:22:50 crc kubenswrapper[4753]: E0129 15:22:50.320406 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36cf39c7-e710-4a55-876a-b62cf79ee934" containerName="mariadb-client-7-default" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.320425 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="36cf39c7-e710-4a55-876a-b62cf79ee934" containerName="mariadb-client-7-default" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.320584 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="36cf39c7-e710-4a55-876a-b62cf79ee934" containerName="mariadb-client-7-default" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.321110 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.326120 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.364913 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm8g8\" (UniqueName: \"kubernetes.io/projected/36cf39c7-e710-4a55-876a-b62cf79ee934-kube-api-access-xm8g8\") on node \"crc\" DevicePath \"\"" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.466816 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n82c\" (UniqueName: \"kubernetes.io/projected/aa665795-9d6f-41ee-9d0e-6783b3f946d7-kube-api-access-4n82c\") pod \"mariadb-client-2\" (UID: \"aa665795-9d6f-41ee-9d0e-6783b3f946d7\") " pod="openstack/mariadb-client-2" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.568414 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n82c\" (UniqueName: \"kubernetes.io/projected/aa665795-9d6f-41ee-9d0e-6783b3f946d7-kube-api-access-4n82c\") pod \"mariadb-client-2\" (UID: \"aa665795-9d6f-41ee-9d0e-6783b3f946d7\") " pod="openstack/mariadb-client-2" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.591141 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n82c\" (UniqueName: \"kubernetes.io/projected/aa665795-9d6f-41ee-9d0e-6783b3f946d7-kube-api-access-4n82c\") pod \"mariadb-client-2\" (UID: \"aa665795-9d6f-41ee-9d0e-6783b3f946d7\") " pod="openstack/mariadb-client-2" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.641934 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.746386 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f0d9b7b0e7af5cda9844e04078783225d6de55e076b458c45874479ed769e2" Jan 29 15:22:50 crc kubenswrapper[4753]: I0129 15:22:50.746529 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Jan 29 15:22:51 crc kubenswrapper[4753]: E0129 15:22:51.151592 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:22:51 crc kubenswrapper[4753]: I0129 15:22:51.202015 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Jan 29 15:22:51 crc kubenswrapper[4753]: W0129 15:22:51.212378 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa665795_9d6f_41ee_9d0e_6783b3f946d7.slice/crio-55b26c8a04f27feb24286a1186d6438e2228f59869b27f41ddc950120937b28c WatchSource:0}: Error finding container 55b26c8a04f27feb24286a1186d6438e2228f59869b27f41ddc950120937b28c: Status 404 returned error can't find the container with id 55b26c8a04f27feb24286a1186d6438e2228f59869b27f41ddc950120937b28c Jan 29 15:22:51 crc kubenswrapper[4753]: I0129 15:22:51.758588 4753 generic.go:334] "Generic (PLEG): container finished" podID="aa665795-9d6f-41ee-9d0e-6783b3f946d7" containerID="5c49c8ffceb19e2e710be36ab0abc78f98514c650584c7c90ccfa3464bbe5bed" exitCode=0 Jan 29 15:22:51 crc kubenswrapper[4753]: I0129 15:22:51.758644 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"aa665795-9d6f-41ee-9d0e-6783b3f946d7","Type":"ContainerDied","Data":"5c49c8ffceb19e2e710be36ab0abc78f98514c650584c7c90ccfa3464bbe5bed"} Jan 29 15:22:51 crc kubenswrapper[4753]: I0129 15:22:51.758681 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"aa665795-9d6f-41ee-9d0e-6783b3f946d7","Type":"ContainerStarted","Data":"55b26c8a04f27feb24286a1186d6438e2228f59869b27f41ddc950120937b28c"} Jan 29 15:22:52 crc kubenswrapper[4753]: I0129 15:22:52.158948 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36cf39c7-e710-4a55-876a-b62cf79ee934" path="/var/lib/kubelet/pods/36cf39c7-e710-4a55-876a-b62cf79ee934/volumes" Jan 29 15:22:53 crc kubenswrapper[4753]: I0129 15:22:53.124954 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Jan 29 15:22:53 crc kubenswrapper[4753]: I0129 15:22:53.141313 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_aa665795-9d6f-41ee-9d0e-6783b3f946d7/mariadb-client-2/0.log" Jan 29 15:22:53 crc kubenswrapper[4753]: E0129 15:22:53.150804 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:22:53 crc kubenswrapper[4753]: I0129 15:22:53.175334 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Jan 29 15:22:53 crc kubenswrapper[4753]: I0129 15:22:53.183172 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Jan 29 15:22:53 crc kubenswrapper[4753]: I0129 15:22:53.213298 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n82c\" (UniqueName: \"kubernetes.io/projected/aa665795-9d6f-41ee-9d0e-6783b3f946d7-kube-api-access-4n82c\") pod \"aa665795-9d6f-41ee-9d0e-6783b3f946d7\" (UID: \"aa665795-9d6f-41ee-9d0e-6783b3f946d7\") " Jan 29 15:22:53 crc kubenswrapper[4753]: I0129 15:22:53.223641 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa665795-9d6f-41ee-9d0e-6783b3f946d7-kube-api-access-4n82c" (OuterVolumeSpecName: "kube-api-access-4n82c") pod "aa665795-9d6f-41ee-9d0e-6783b3f946d7" (UID: "aa665795-9d6f-41ee-9d0e-6783b3f946d7"). InnerVolumeSpecName "kube-api-access-4n82c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:22:53 crc kubenswrapper[4753]: I0129 15:22:53.314976 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n82c\" (UniqueName: \"kubernetes.io/projected/aa665795-9d6f-41ee-9d0e-6783b3f946d7-kube-api-access-4n82c\") on node \"crc\" DevicePath \"\"" Jan 29 15:22:53 crc kubenswrapper[4753]: I0129 15:22:53.780675 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55b26c8a04f27feb24286a1186d6438e2228f59869b27f41ddc950120937b28c" Jan 29 15:22:53 crc kubenswrapper[4753]: I0129 15:22:53.780749 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Jan 29 15:22:54 crc kubenswrapper[4753]: I0129 15:22:54.173220 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa665795-9d6f-41ee-9d0e-6783b3f946d7" path="/var/lib/kubelet/pods/aa665795-9d6f-41ee-9d0e-6783b3f946d7/volumes" Jan 29 15:22:57 crc kubenswrapper[4753]: I0129 15:22:57.055140 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:22:57 crc kubenswrapper[4753]: I0129 15:22:57.055491 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.011285 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8kgkj"] Jan 29 15:23:01 crc kubenswrapper[4753]: E0129 15:23:01.011809 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa665795-9d6f-41ee-9d0e-6783b3f946d7" containerName="mariadb-client-2" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.011820 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa665795-9d6f-41ee-9d0e-6783b3f946d7" containerName="mariadb-client-2" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.011975 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa665795-9d6f-41ee-9d0e-6783b3f946d7" containerName="mariadb-client-2" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.012947 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.025606 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8kgkj"] Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.138053 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-utilities\") pod \"community-operators-8kgkj\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") " pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.138571 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6gf8\" (UniqueName: \"kubernetes.io/projected/506e7d4a-3e33-45e0-854c-cd484ba78694-kube-api-access-q6gf8\") pod \"community-operators-8kgkj\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") " pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.138613 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-catalog-content\") pod \"community-operators-8kgkj\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") " pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.240376 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-utilities\") pod \"community-operators-8kgkj\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") " pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.240449 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6gf8\" (UniqueName: \"kubernetes.io/projected/506e7d4a-3e33-45e0-854c-cd484ba78694-kube-api-access-q6gf8\") pod \"community-operators-8kgkj\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") " pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.240477 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-catalog-content\") pod \"community-operators-8kgkj\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") " pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.241681 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-utilities\") pod \"community-operators-8kgkj\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") " pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.242078 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-catalog-content\") pod \"community-operators-8kgkj\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") " pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.265092 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q6gf8\" (UniqueName: \"kubernetes.io/projected/506e7d4a-3e33-45e0-854c-cd484ba78694-kube-api-access-q6gf8\") pod \"community-operators-8kgkj\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") " pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.344638 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8kgkj" Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.639325 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8kgkj"] Jan 29 15:23:01 crc kubenswrapper[4753]: W0129 15:23:01.649687 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod506e7d4a_3e33_45e0_854c_cd484ba78694.slice/crio-ea29cc3eb737a9b2838b44141122aad341fea8fd0927ea9a128f5dc17fd1d929 WatchSource:0}: Error finding container ea29cc3eb737a9b2838b44141122aad341fea8fd0927ea9a128f5dc17fd1d929: Status 404 returned error can't find the container with id ea29cc3eb737a9b2838b44141122aad341fea8fd0927ea9a128f5dc17fd1d929 Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.844426 4753 generic.go:334] "Generic (PLEG): container finished" podID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerID="8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c" exitCode=0 Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.844474 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kgkj" event={"ID":"506e7d4a-3e33-45e0-854c-cd484ba78694","Type":"ContainerDied","Data":"8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c"} Jan 29 15:23:01 crc kubenswrapper[4753]: I0129 15:23:01.844502 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kgkj" event={"ID":"506e7d4a-3e33-45e0-854c-cd484ba78694","Type":"ContainerStarted","Data":"ea29cc3eb737a9b2838b44141122aad341fea8fd0927ea9a128f5dc17fd1d929"} Jan 29 15:23:01 crc kubenswrapper[4753]: E0129 15:23:01.969973 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 15:23:01 crc kubenswrapper[4753]: E0129 15:23:01.970536 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6gf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8kgkj_openshift-marketplace(506e7d4a-3e33-45e0-854c-cd484ba78694): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:23:01 crc kubenswrapper[4753]: E0129 15:23:01.971829 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-8kgkj" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" Jan 29 15:23:02 crc kubenswrapper[4753]: E0129 15:23:02.857369 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8kgkj" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" Jan 29 15:23:04 crc kubenswrapper[4753]: E0129 15:23:04.152509 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:23:06 crc kubenswrapper[4753]: E0129 15:23:06.155934 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:23:06 crc kubenswrapper[4753]: I0129 15:23:06.426697 4753 scope.go:117] "RemoveContainer" containerID="0505363f7b613cbedabfefdb2d26ee3018c6869379b11973da0d4ce0196c990f" Jan 29 15:23:16 crc kubenswrapper[4753]: E0129 15:23:16.294273 4753 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 15:23:16 crc kubenswrapper[4753]: E0129 15:23:16.294940 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6gf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8kgkj_openshift-marketplace(506e7d4a-3e33-45e0-854c-cd484ba78694): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:23:16 crc kubenswrapper[4753]: E0129 15:23:16.296205 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-8kgkj" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" Jan 29 15:23:19 crc kubenswrapper[4753]: E0129 15:23:19.152082 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" Jan 29 15:23:21 crc kubenswrapper[4753]: E0129 15:23:21.151623 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" Jan 29 15:23:27 crc kubenswrapper[4753]: I0129 15:23:27.055803 4753 
patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 15:23:27 crc kubenswrapper[4753]: I0129 15:23:27.056966 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 15:23:31 crc kubenswrapper[4753]: E0129 15:23:31.152726 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8kgkj" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694"
Jan 29 15:23:34 crc kubenswrapper[4753]: E0129 15:23:34.152139 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:23:34 crc kubenswrapper[4753]: E0129 15:23:34.152429 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:23:43 crc kubenswrapper[4753]: E0129 15:23:43.276287 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Jan 29 15:23:43 crc kubenswrapper[4753]: E0129 15:23:43.277130 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6gf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8kgkj_openshift-marketplace(506e7d4a-3e33-45e0-854c-cd484ba78694): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError"
Jan 29 15:23:43 crc kubenswrapper[4753]: E0129 15:23:43.278434 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-8kgkj" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694"
Jan 29 15:23:45 crc kubenswrapper[4753]: E0129 15:23:45.151530 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:23:47 crc kubenswrapper[4753]: E0129 15:23:47.151246 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:23:56 crc kubenswrapper[4753]: E0129 15:23:56.156364 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:23:57 crc kubenswrapper[4753]: I0129 15:23:57.055038 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 15:23:57 crc kubenswrapper[4753]: I0129 15:23:57.055524 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 15:23:57 crc kubenswrapper[4753]: I0129 15:23:57.055596 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 15:23:57 crc kubenswrapper[4753]: I0129 15:23:57.056635 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87c9b25e517a2d6c985d3c8b6f73dd009a38a0273f33cbbbc56c00d084557cd5"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 29 15:23:57 crc kubenswrapper[4753]: I0129 15:23:57.056745 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://87c9b25e517a2d6c985d3c8b6f73dd009a38a0273f33cbbbc56c00d084557cd5" gracePeriod=600
Jan 29 15:23:57 crc kubenswrapper[4753]: I0129 15:23:57.335421 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="87c9b25e517a2d6c985d3c8b6f73dd009a38a0273f33cbbbc56c00d084557cd5" exitCode=0
Jan 29 15:23:57 crc kubenswrapper[4753]: I0129 15:23:57.335469 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"87c9b25e517a2d6c985d3c8b6f73dd009a38a0273f33cbbbc56c00d084557cd5"}
Jan 29 15:23:57 crc kubenswrapper[4753]: I0129 15:23:57.335503 4753 scope.go:117] "RemoveContainer" containerID="ad56cb02bd1d838d9936483cc088a3a205b2ea0581b72158af4e6fd3d7978c20"
Jan 29 15:23:58 crc kubenswrapper[4753]: E0129 15:23:58.151637 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8kgkj" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694"
Jan 29 15:23:58 crc kubenswrapper[4753]: I0129 15:23:58.342967 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca"}
Jan 29 15:23:59 crc kubenswrapper[4753]: E0129 15:23:59.150819 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:24:09 crc kubenswrapper[4753]: E0129 15:24:09.151195 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:24:09 crc kubenswrapper[4753]: E0129 15:24:09.151430 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8kgkj" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694"
Jan 29 15:24:12 crc kubenswrapper[4753]: E0129 15:24:12.153090 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:24:23 crc kubenswrapper[4753]: E0129 15:24:23.153179 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:24:24 crc kubenswrapper[4753]: E0129 15:24:24.153433 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:24:25 crc kubenswrapper[4753]: I0129 15:24:25.548863 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kgkj" event={"ID":"506e7d4a-3e33-45e0-854c-cd484ba78694","Type":"ContainerStarted","Data":"ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1"}
Jan 29 15:24:26 crc kubenswrapper[4753]: I0129 15:24:26.559808 4753 generic.go:334] "Generic (PLEG): container finished" podID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerID="ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1" exitCode=0
Jan 29 15:24:26 crc kubenswrapper[4753]: I0129 15:24:26.559874 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kgkj" event={"ID":"506e7d4a-3e33-45e0-854c-cd484ba78694","Type":"ContainerDied","Data":"ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1"}
Jan 29 15:24:27 crc kubenswrapper[4753]: I0129 15:24:27.567536 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kgkj" event={"ID":"506e7d4a-3e33-45e0-854c-cd484ba78694","Type":"ContainerStarted","Data":"cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3"}
Jan 29 15:24:27 crc kubenswrapper[4753]: I0129 15:24:27.590432 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8kgkj" podStartSLOduration=2.378080463 podStartE2EDuration="1m27.590410951s" podCreationTimestamp="2026-01-29 15:23:00 +0000 UTC" firstStartedPulling="2026-01-29 15:23:01.845972994 +0000 UTC m=+4816.540707376" lastFinishedPulling="2026-01-29 15:24:27.058303482 +0000 UTC m=+4901.753037864" observedRunningTime="2026-01-29 15:24:27.584780499 +0000 UTC m=+4902.279514881" watchObservedRunningTime="2026-01-29 15:24:27.590410951 +0000 UTC m=+4902.285145353"
Jan 29 15:24:31 crc kubenswrapper[4753]: I0129 15:24:31.345326 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8kgkj"
Jan 29 15:24:31 crc kubenswrapper[4753]: I0129 15:24:31.346115 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8kgkj"
Jan 29 15:24:31 crc kubenswrapper[4753]: I0129 15:24:31.405523 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8kgkj"
Jan 29 15:24:35 crc kubenswrapper[4753]: E0129 15:24:35.151981 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:24:36 crc kubenswrapper[4753]: E0129 15:24:36.156270 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:24:41 crc kubenswrapper[4753]: I0129 15:24:41.392676 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8kgkj"
Jan 29 15:24:41 crc kubenswrapper[4753]: I0129 15:24:41.433365 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8kgkj"]
Jan 29 15:24:41 crc kubenswrapper[4753]: I0129 15:24:41.708261 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8kgkj" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerName="registry-server" containerID="cri-o://cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3" gracePeriod=2
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.164119 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8kgkj"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.265811 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-catalog-content\") pod \"506e7d4a-3e33-45e0-854c-cd484ba78694\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") "
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.265965 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-utilities\") pod \"506e7d4a-3e33-45e0-854c-cd484ba78694\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") "
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.265990 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6gf8\" (UniqueName: \"kubernetes.io/projected/506e7d4a-3e33-45e0-854c-cd484ba78694-kube-api-access-q6gf8\") pod \"506e7d4a-3e33-45e0-854c-cd484ba78694\" (UID: \"506e7d4a-3e33-45e0-854c-cd484ba78694\") "
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.267053 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-utilities" (OuterVolumeSpecName: "utilities") pod "506e7d4a-3e33-45e0-854c-cd484ba78694" (UID: "506e7d4a-3e33-45e0-854c-cd484ba78694"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.267480 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.271521 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506e7d4a-3e33-45e0-854c-cd484ba78694-kube-api-access-q6gf8" (OuterVolumeSpecName: "kube-api-access-q6gf8") pod "506e7d4a-3e33-45e0-854c-cd484ba78694" (UID: "506e7d4a-3e33-45e0-854c-cd484ba78694"). InnerVolumeSpecName "kube-api-access-q6gf8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.315075 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "506e7d4a-3e33-45e0-854c-cd484ba78694" (UID: "506e7d4a-3e33-45e0-854c-cd484ba78694"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.368697 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6gf8\" (UniqueName: \"kubernetes.io/projected/506e7d4a-3e33-45e0-854c-cd484ba78694-kube-api-access-q6gf8\") on node \"crc\" DevicePath \"\""
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.368735 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/506e7d4a-3e33-45e0-854c-cd484ba78694-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.716837 4753 generic.go:334] "Generic (PLEG): container finished" podID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerID="cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3" exitCode=0
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.716909 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8kgkj"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.716914 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kgkj" event={"ID":"506e7d4a-3e33-45e0-854c-cd484ba78694","Type":"ContainerDied","Data":"cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3"}
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.718127 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kgkj" event={"ID":"506e7d4a-3e33-45e0-854c-cd484ba78694","Type":"ContainerDied","Data":"ea29cc3eb737a9b2838b44141122aad341fea8fd0927ea9a128f5dc17fd1d929"}
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.718173 4753 scope.go:117] "RemoveContainer" containerID="cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.736865 4753 scope.go:117] "RemoveContainer" containerID="ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.749455 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8kgkj"]
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.755932 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8kgkj"]
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.776874 4753 scope.go:117] "RemoveContainer" containerID="8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.792761 4753 scope.go:117] "RemoveContainer" containerID="cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3"
Jan 29 15:24:42 crc kubenswrapper[4753]: E0129 15:24:42.793273 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3\": container with ID starting with cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3 not found: ID does not exist" containerID="cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.793319 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3"} err="failed to get container status \"cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3\": rpc error: code = NotFound desc = could not find container \"cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3\": container with ID starting with cd560b8a1fbf149787f61e7bf9e1c948bc84e7b6f488b193de0b6fc731ac17f3 not found: ID does not exist"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.793346 4753 scope.go:117] "RemoveContainer" containerID="ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1"
Jan 29 15:24:42 crc kubenswrapper[4753]: E0129 15:24:42.793634 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1\": container with ID starting with ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1 not found: ID does not exist" containerID="ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.793668 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1"} err="failed to get container status \"ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1\": rpc error: code = NotFound desc = could not find container \"ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1\": container with ID starting with ce46978c4e1becd59696469546ded5633d6efb54d8c6fa00eeab9366464083b1 not found: ID does not exist"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.793692 4753 scope.go:117] "RemoveContainer" containerID="8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c"
Jan 29 15:24:42 crc kubenswrapper[4753]: E0129 15:24:42.793899 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c\": container with ID starting with 8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c not found: ID does not exist" containerID="8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c"
Jan 29 15:24:42 crc kubenswrapper[4753]: I0129 15:24:42.793927 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c"} err="failed to get container status \"8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c\": rpc error: code = NotFound desc = could not find container \"8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c\": container with ID starting with 8823a948c8788abd97402bb055dbe20c32d5b536e2b22e111090a9b9a8bc1d9c not found: ID does not exist"
Jan 29 15:24:44 crc kubenswrapper[4753]: I0129 15:24:44.164071 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" path="/var/lib/kubelet/pods/506e7d4a-3e33-45e0-854c-cd484ba78694/volumes"
Jan 29 15:24:47 crc kubenswrapper[4753]: E0129 15:24:47.151238 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:24:47 crc kubenswrapper[4753]: E0129 15:24:47.151398 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:25:00 crc kubenswrapper[4753]: E0129 15:25:00.152779 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36"
Jan 29 15:25:01 crc kubenswrapper[4753]: E0129 15:25:01.150863 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:25:14 crc kubenswrapper[4753]: E0129 15:25:14.153513 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e"
Jan 29 15:25:16 crc kubenswrapper[4753]: I0129 15:25:16.046202 4753 generic.go:334] "Generic (PLEG): container finished" podID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerID="70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b" exitCode=0
Jan 29 15:25:16 crc kubenswrapper[4753]: I0129 15:25:16.046249 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wxf5" event={"ID":"2dfdc10e-dc31-4565-b790-9b778061ba36","Type":"ContainerDied","Data":"70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b"}
Jan 29 15:25:17 crc kubenswrapper[4753]: I0129 15:25:17.057733 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wxf5" event={"ID":"2dfdc10e-dc31-4565-b790-9b778061ba36","Type":"ContainerStarted","Data":"25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11"}
Jan 29 15:25:17 crc kubenswrapper[4753]: I0129 15:25:17.079943 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9wxf5" podStartSLOduration=3.538404871 podStartE2EDuration="5m48.079925644s" podCreationTimestamp="2026-01-29 15:19:29 +0000 UTC" firstStartedPulling="2026-01-29 15:19:31.90244059 +0000 UTC m=+4606.597174982" lastFinishedPulling="2026-01-29 15:25:16.443961373 +0000 UTC m=+4951.138695755" observedRunningTime="2026-01-29 15:25:17.07867215 +0000 UTC m=+4951.773406542" watchObservedRunningTime="2026-01-29 15:25:17.079925644 +0000 UTC m=+4951.774660026"
Jan 29 15:25:20 crc kubenswrapper[4753]: I0129 15:25:20.349894 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:25:20 crc kubenswrapper[4753]: I0129 15:25:20.350676 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:25:20 crc kubenswrapper[4753]: I0129 15:25:20.399934 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:25:21 crc kubenswrapper[4753]: I0129 15:25:21.181007 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:25:21 crc kubenswrapper[4753]: I0129 15:25:21.247010 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9wxf5"]
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.132784 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9wxf5" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerName="registry-server" containerID="cri-o://25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11" gracePeriod=2
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.576621 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.777049 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-utilities\") pod \"2dfdc10e-dc31-4565-b790-9b778061ba36\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") "
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.777132 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-catalog-content\") pod \"2dfdc10e-dc31-4565-b790-9b778061ba36\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") "
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.777466 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qwgf\" (UniqueName: \"kubernetes.io/projected/2dfdc10e-dc31-4565-b790-9b778061ba36-kube-api-access-9qwgf\") pod \"2dfdc10e-dc31-4565-b790-9b778061ba36\" (UID: \"2dfdc10e-dc31-4565-b790-9b778061ba36\") "
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.778191 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-utilities" (OuterVolumeSpecName: "utilities") pod "2dfdc10e-dc31-4565-b790-9b778061ba36" (UID: "2dfdc10e-dc31-4565-b790-9b778061ba36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.791389 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfdc10e-dc31-4565-b790-9b778061ba36-kube-api-access-9qwgf" (OuterVolumeSpecName: "kube-api-access-9qwgf") pod "2dfdc10e-dc31-4565-b790-9b778061ba36" (UID: "2dfdc10e-dc31-4565-b790-9b778061ba36"). InnerVolumeSpecName "kube-api-access-9qwgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.840593 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dfdc10e-dc31-4565-b790-9b778061ba36" (UID: "2dfdc10e-dc31-4565-b790-9b778061ba36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.879466 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qwgf\" (UniqueName: \"kubernetes.io/projected/2dfdc10e-dc31-4565-b790-9b778061ba36-kube-api-access-9qwgf\") on node \"crc\" DevicePath \"\""
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.879536 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 15:25:23 crc kubenswrapper[4753]: I0129 15:25:23.879567 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dfdc10e-dc31-4565-b790-9b778061ba36-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.143882 4753 generic.go:334] "Generic (PLEG): container finished" podID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerID="25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11" exitCode=0
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.143931 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wxf5" event={"ID":"2dfdc10e-dc31-4565-b790-9b778061ba36","Type":"ContainerDied","Data":"25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11"}
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.144004 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wxf5" event={"ID":"2dfdc10e-dc31-4565-b790-9b778061ba36","Type":"ContainerDied","Data":"64804fdedb8b5f372aaf28c349a3b67da65b715fae1a28d0bc9f9c7a4a6ec392"}
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.144002 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wxf5"
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.144034 4753 scope.go:117] "RemoveContainer" containerID="25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11"
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.181134 4753 scope.go:117] "RemoveContainer" containerID="70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b"
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.184239 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9wxf5"]
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.197815 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9wxf5"]
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.203216 4753 scope.go:117] "RemoveContainer" containerID="05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa"
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.264246 4753 scope.go:117] "RemoveContainer" containerID="25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11"
Jan 29 15:25:24 crc kubenswrapper[4753]: E0129 15:25:24.264629 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11\": container with ID starting with 25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11 not found: ID does not exist" containerID="25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11"
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.264666 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11"} err="failed to get container status \"25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11\": rpc error: code = NotFound desc = could not find container \"25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11\": container with ID starting with 25f529dd7cacabc6cf74669e716325c0179c1b53ae12dee3050bd129c561bd11 not found: ID does not exist"
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.264693 4753 scope.go:117] "RemoveContainer" containerID="70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b"
Jan 29 15:25:24 crc kubenswrapper[4753]: E0129 15:25:24.265020 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b\": container with ID starting with 70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b not found: ID does not exist" containerID="70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b"
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.265063 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b"} err="failed to get container status \"70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b\": rpc error: code = NotFound desc = could not find container \"70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b\": container with ID starting with 70ab236b65d5a52841430dab94a43f1544609bcdf17cf64e311088897b22775b not found: ID does not exist"
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.265089 4753 scope.go:117] "RemoveContainer" containerID="05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa"
Jan 29 15:25:24 crc kubenswrapper[4753]: E0129 15:25:24.265610 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa\": container with ID starting with 05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa not found: ID does not exist" containerID="05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa"
Jan 29 15:25:24 crc kubenswrapper[4753]: I0129 15:25:24.265861 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa"} err="failed to get container status \"05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa\": rpc error: code = NotFound desc = could not find container \"05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa\": container with ID starting with 05b6d2f20a877892ec3daa09149233cf9ad23eb72bc3035564d07f7719dc2ffa not found: ID does not exist"
Jan 29 15:25:26 crc kubenswrapper[4753]: I0129 15:25:26.165519 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" path="/var/lib/kubelet/pods/2dfdc10e-dc31-4565-b790-9b778061ba36/volumes"
Jan 29 15:25:29 crc kubenswrapper[4753]: I0129 15:25:29.190282 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxsh4" event={"ID":"d7388aa3-c682-48f9-b7d2-db1beee6393e","Type":"ContainerStarted","Data":"70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1"}
Jan 29 15:25:30 crc kubenswrapper[4753]: I0129 15:25:30.200475 4753 generic.go:334] "Generic (PLEG): container finished" podID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerID="70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1" exitCode=0
Jan 29 15:25:30 crc kubenswrapper[4753]: I0129 15:25:30.200555 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxsh4" event={"ID":"d7388aa3-c682-48f9-b7d2-db1beee6393e","Type":"ContainerDied","Data":"70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1"}
Jan 29 15:25:31 crc kubenswrapper[4753]: I0129 15:25:31.215800 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxsh4" event={"ID":"d7388aa3-c682-48f9-b7d2-db1beee6393e","Type":"ContainerStarted","Data":"9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1"}
Jan 29 15:25:31 crc kubenswrapper[4753]: I0129 15:25:31.247986 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cxsh4" podStartSLOduration=1.517762636 podStartE2EDuration="5m58.247961326s" podCreationTimestamp="2026-01-29 15:19:33 +0000 UTC" firstStartedPulling="2026-01-29 15:19:33.914666482 +0000 UTC m=+4608.609400864" lastFinishedPulling="2026-01-29 15:25:30.644865172 +0000 UTC m=+4965.339599554" observedRunningTime="2026-01-29 15:25:31.241976635 +0000 UTC m=+4965.936711027" watchObservedRunningTime="2026-01-29 15:25:31.247961326 +0000 UTC m=+4965.942695728"
Jan 29 15:25:33 crc kubenswrapper[4753]: I0129 15:25:33.470699 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:25:33 crc kubenswrapper[4753]: I0129 15:25:33.471020 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:25:34 crc kubenswrapper[4753]: I0129 15:25:34.509010 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerName="registry-server" probeResult="failure" output=<
Jan 29 15:25:34 crc kubenswrapper[4753]: 	timeout: failed to connect service ":50051" within 1s
Jan 29 15:25:34 crc kubenswrapper[4753]: >
Jan 29 15:25:43 crc kubenswrapper[4753]: I0129 15:25:43.519434 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:25:43 crc kubenswrapper[4753]: I0129 15:25:43.566311 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:25:43 crc kubenswrapper[4753]: I0129 15:25:43.765313 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxsh4"]
Jan 29 15:25:45 crc kubenswrapper[4753]: I0129 15:25:45.324524 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cxsh4" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerName="registry-server" containerID="cri-o://9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1" gracePeriod=2
Jan 29 15:25:45 crc kubenswrapper[4753]: I0129 15:25:45.770703 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:25:45 crc kubenswrapper[4753]: I0129 15:25:45.947559 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-catalog-content\") pod \"d7388aa3-c682-48f9-b7d2-db1beee6393e\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") "
Jan 29 15:25:45 crc kubenswrapper[4753]: I0129 15:25:45.947655 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-utilities\") pod \"d7388aa3-c682-48f9-b7d2-db1beee6393e\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") "
Jan 29 15:25:45 crc kubenswrapper[4753]: I0129 15:25:45.947706 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6bn6\" (UniqueName: \"kubernetes.io/projected/d7388aa3-c682-48f9-b7d2-db1beee6393e-kube-api-access-x6bn6\") pod \"d7388aa3-c682-48f9-b7d2-db1beee6393e\" (UID: \"d7388aa3-c682-48f9-b7d2-db1beee6393e\") "
Jan 29 15:25:45 crc kubenswrapper[4753]: I0129 15:25:45.948484 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-utilities" (OuterVolumeSpecName: "utilities") pod "d7388aa3-c682-48f9-b7d2-db1beee6393e" (UID: "d7388aa3-c682-48f9-b7d2-db1beee6393e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:25:45 crc kubenswrapper[4753]: I0129 15:25:45.954420 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7388aa3-c682-48f9-b7d2-db1beee6393e-kube-api-access-x6bn6" (OuterVolumeSpecName: "kube-api-access-x6bn6") pod "d7388aa3-c682-48f9-b7d2-db1beee6393e" (UID: "d7388aa3-c682-48f9-b7d2-db1beee6393e"). InnerVolumeSpecName "kube-api-access-x6bn6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.048800 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6bn6\" (UniqueName: \"kubernetes.io/projected/d7388aa3-c682-48f9-b7d2-db1beee6393e-kube-api-access-x6bn6\") on node \"crc\" DevicePath \"\""
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.048835 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.095274 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7388aa3-c682-48f9-b7d2-db1beee6393e" (UID: "d7388aa3-c682-48f9-b7d2-db1beee6393e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.150179 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7388aa3-c682-48f9-b7d2-db1beee6393e-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.333926 4753 generic.go:334] "Generic (PLEG): container finished" podID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerID="9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1" exitCode=0
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.333979 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxsh4" event={"ID":"d7388aa3-c682-48f9-b7d2-db1beee6393e","Type":"ContainerDied","Data":"9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1"}
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.334008 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxsh4" event={"ID":"d7388aa3-c682-48f9-b7d2-db1beee6393e","Type":"ContainerDied","Data":"24198fe41fb4f512e0d5435d1b669f34f9cd0df482de5c83cde4d4ca1a373227"}
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.334020 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxsh4"
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.334042 4753 scope.go:117] "RemoveContainer" containerID="9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1"
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.359485 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxsh4"]
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.361644 4753 scope.go:117] "RemoveContainer" containerID="70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1"
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.366724 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cxsh4"]
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.383541 4753 scope.go:117] "RemoveContainer" containerID="83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca"
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.418574 4753 scope.go:117] "RemoveContainer" containerID="9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1"
Jan 29 15:25:46 crc kubenswrapper[4753]: E0129 15:25:46.419574 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1\": container with ID starting with 9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1 not found: ID does not exist" containerID="9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1"
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.419635 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1"} err="failed to get container status \"9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1\": rpc error: code = NotFound desc = could not find container \"9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1\": container with ID starting with 9d3278906064e0d9fea4b501bf3f762c7451d90329191fb3bd731bf7cf01d8c1 not found: ID does not exist"
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.419674 4753 scope.go:117] "RemoveContainer" containerID="70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1"
Jan 29 15:25:46 crc kubenswrapper[4753]: E0129 15:25:46.420177 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1\": container with ID starting with 70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1 not found: ID does not exist" containerID="70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1"
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.420227 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1"} err="failed to get container status \"70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1\": rpc error: code = NotFound desc = could not find container \"70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1\": container with ID starting with 70a530e71ab2ba1e958781704288b2145f90b357f5b576258997e3c36e8fcbe1 not found: ID does not exist"
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.420255 4753 scope.go:117] "RemoveContainer" containerID="83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca"
Jan 29 15:25:46 crc kubenswrapper[4753]: E0129 15:25:46.420530 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca\": container with ID starting with 83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca not found: ID does not exist" containerID="83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca"
Jan 29 15:25:46 crc kubenswrapper[4753]: I0129 15:25:46.420581 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca"} err="failed to get container status \"83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca\": rpc error: code = NotFound desc = could not find container \"83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca\": container with ID starting with 83c087de75c65c1fcee2e0d6cce9b75f1cc51063c2b20308e7a1a5182ba21eca not found: ID does not exist"
Jan 29 15:25:48 crc kubenswrapper[4753]: I0129 15:25:48.160008 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" path="/var/lib/kubelet/pods/d7388aa3-c682-48f9-b7d2-db1beee6393e/volumes"
Jan 29 15:25:57 crc kubenswrapper[4753]: I0129 15:25:57.055276 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 15:25:57 crc kubenswrapper[4753]: I0129 15:25:57.056054 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 15:26:27 crc kubenswrapper[4753]: I0129 15:26:27.055164 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 15:26:27 crc kubenswrapper[4753]: I0129 15:26:27.055759 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 15:26:57 crc kubenswrapper[4753]: I0129 15:26:57.055029 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 15:26:57 crc kubenswrapper[4753]: I0129 15:26:57.056049 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 15:26:57 crc kubenswrapper[4753]: I0129 15:26:57.056120 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz"
Jan 29 15:26:57 crc kubenswrapper[4753]: I0129 15:26:57.057213 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 29 15:26:57 crc kubenswrapper[4753]: I0129 15:26:57.057282 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" gracePeriod=600
Jan 29 15:26:57 crc kubenswrapper[4753]: E0129 15:26:57.178606 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:26:58 crc kubenswrapper[4753]: I0129 15:26:58.016038 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" exitCode=0
Jan 29 15:26:58 crc kubenswrapper[4753]: I0129 15:26:58.016126 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca"}
Jan 29 15:26:58 crc kubenswrapper[4753]: I0129 15:26:58.016288 4753 scope.go:117] "RemoveContainer" containerID="87c9b25e517a2d6c985d3c8b6f73dd009a38a0273f33cbbbc56c00d084557cd5"
Jan 29 15:26:58 crc kubenswrapper[4753]: I0129 15:26:58.016934 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca"
Jan 29 15:26:58 crc kubenswrapper[4753]: E0129 15:26:58.017201 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.251567 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"]
Jan 29 15:27:03 crc kubenswrapper[4753]: E0129 15:27:03.252631 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerName="extract-utilities"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252649 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerName="extract-utilities"
Jan 29 15:27:03 crc kubenswrapper[4753]: E0129 15:27:03.252667 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerName="registry-server"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252674 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerName="registry-server"
Jan 29 15:27:03 crc kubenswrapper[4753]: E0129 15:27:03.252690 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerName="extract-content"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252697 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerName="extract-content"
Jan 29 15:27:03 crc kubenswrapper[4753]: E0129 15:27:03.252709 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerName="registry-server"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252716 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerName="registry-server"
Jan 29 15:27:03 crc kubenswrapper[4753]: E0129 15:27:03.252730 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerName="registry-server"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252738 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerName="registry-server"
Jan 29 15:27:03 crc kubenswrapper[4753]: E0129 15:27:03.252755 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerName="extract-utilities"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252763 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerName="extract-utilities"
Jan 29 15:27:03 crc kubenswrapper[4753]: E0129 15:27:03.252773 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerName="extract-content"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252780 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerName="extract-content"
Jan 29 15:27:03 crc kubenswrapper[4753]: E0129 15:27:03.252792 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerName="extract-content"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252799 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerName="extract-content"
Jan 29 15:27:03 crc kubenswrapper[4753]: E0129 15:27:03.252812 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerName="extract-utilities"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252820 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerName="extract-utilities"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.252984 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7388aa3-c682-48f9-b7d2-db1beee6393e" containerName="registry-server"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.253000 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfdc10e-dc31-4565-b790-9b778061ba36" containerName="registry-server"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.253014 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="506e7d4a-3e33-45e0-854c-cd484ba78694" containerName="registry-server"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.253647 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.257360 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bphpf"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.261473 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.308702 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vzq\" (UniqueName: \"kubernetes.io/projected/744ae72c-556c-4e95-afc0-0f7fb7cb0923-kube-api-access-44vzq\") pod \"mariadb-copy-data\" (UID: \"744ae72c-556c-4e95-afc0-0f7fb7cb0923\") " pod="openstack/mariadb-copy-data"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.308814 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7ceb18a9-468e-4441-84d7-a902129ca40a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ceb18a9-468e-4441-84d7-a902129ca40a\") pod \"mariadb-copy-data\" (UID: \"744ae72c-556c-4e95-afc0-0f7fb7cb0923\") " pod="openstack/mariadb-copy-data"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.410558 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7ceb18a9-468e-4441-84d7-a902129ca40a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ceb18a9-468e-4441-84d7-a902129ca40a\") pod \"mariadb-copy-data\" (UID: \"744ae72c-556c-4e95-afc0-0f7fb7cb0923\") " pod="openstack/mariadb-copy-data"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.410652 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vzq\" (UniqueName: \"kubernetes.io/projected/744ae72c-556c-4e95-afc0-0f7fb7cb0923-kube-api-access-44vzq\") pod \"mariadb-copy-data\" (UID: \"744ae72c-556c-4e95-afc0-0f7fb7cb0923\") " pod="openstack/mariadb-copy-data"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.415550 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.415588 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7ceb18a9-468e-4441-84d7-a902129ca40a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ceb18a9-468e-4441-84d7-a902129ca40a\") pod \"mariadb-copy-data\" (UID: \"744ae72c-556c-4e95-afc0-0f7fb7cb0923\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd78fa0acb4aa3da68647d9a98ef77a1f119dedd96b4f08cb3903219ab7c3f64/globalmount\"" pod="openstack/mariadb-copy-data"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.435579 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vzq\" (UniqueName: \"kubernetes.io/projected/744ae72c-556c-4e95-afc0-0f7fb7cb0923-kube-api-access-44vzq\") pod \"mariadb-copy-data\" (UID: \"744ae72c-556c-4e95-afc0-0f7fb7cb0923\") " pod="openstack/mariadb-copy-data"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.448931 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7ceb18a9-468e-4441-84d7-a902129ca40a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ceb18a9-468e-4441-84d7-a902129ca40a\") pod \"mariadb-copy-data\" (UID: \"744ae72c-556c-4e95-afc0-0f7fb7cb0923\") " pod="openstack/mariadb-copy-data"
Jan 29 15:27:03 crc kubenswrapper[4753]: I0129 15:27:03.578080 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Jan 29 15:27:04 crc kubenswrapper[4753]: I0129 15:27:04.121357 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Jan 29 15:27:05 crc kubenswrapper[4753]: I0129 15:27:05.086384 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"744ae72c-556c-4e95-afc0-0f7fb7cb0923","Type":"ContainerStarted","Data":"e0a761c34f96748b921fc1a23082057464ae67877a2d1551042dc15b2c426482"}
Jan 29 15:27:05 crc kubenswrapper[4753]: I0129 15:27:05.086899 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"744ae72c-556c-4e95-afc0-0f7fb7cb0923","Type":"ContainerStarted","Data":"b713b1baa072c50543e151a5a3dd6dcf1e604bf72d7539432e2b193b2ae52bae"}
Jan 29 15:27:05 crc kubenswrapper[4753]: I0129 15:27:05.118051 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.118001929 podStartE2EDuration="3.118001929s" podCreationTimestamp="2026-01-29 15:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:27:05.10873002 +0000 UTC m=+5059.803464412" watchObservedRunningTime="2026-01-29 15:27:05.118001929 +0000 UTC m=+5059.812736321"
Jan 29 15:27:06 crc kubenswrapper[4753]: I0129 15:27:06.582021 4753 scope.go:117] "RemoveContainer" containerID="1969da0e5321749e4e89d0fea5bb7854b3710a35c18dccb0896600571dc2014e"
Jan 29 15:27:07 crc kubenswrapper[4753]: I0129 15:27:07.903409 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Jan 29 15:27:07 crc kubenswrapper[4753]: I0129 15:27:07.904943 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 29 15:27:07 crc kubenswrapper[4753]: I0129 15:27:07.919924 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Jan 29 15:27:08 crc kubenswrapper[4753]: I0129 15:27:08.086023 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwsrl\" (UniqueName: \"kubernetes.io/projected/ffd96ccb-c3ae-4464-ac99-936e15780c1f-kube-api-access-lwsrl\") pod \"mariadb-client\" (UID: \"ffd96ccb-c3ae-4464-ac99-936e15780c1f\") " pod="openstack/mariadb-client"
Jan 29 15:27:08 crc kubenswrapper[4753]: I0129 15:27:08.187091 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwsrl\" (UniqueName: \"kubernetes.io/projected/ffd96ccb-c3ae-4464-ac99-936e15780c1f-kube-api-access-lwsrl\") pod \"mariadb-client\" (UID: \"ffd96ccb-c3ae-4464-ac99-936e15780c1f\") " pod="openstack/mariadb-client"
Jan 29 15:27:08 crc kubenswrapper[4753]: I0129 15:27:08.222915 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwsrl\" (UniqueName: \"kubernetes.io/projected/ffd96ccb-c3ae-4464-ac99-936e15780c1f-kube-api-access-lwsrl\") pod \"mariadb-client\" (UID: \"ffd96ccb-c3ae-4464-ac99-936e15780c1f\") " pod="openstack/mariadb-client"
Jan 29 15:27:08 crc kubenswrapper[4753]: I0129 15:27:08.227243 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Jan 29 15:27:08 crc kubenswrapper[4753]: I0129 15:27:08.691830 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Jan 29 15:27:08 crc kubenswrapper[4753]: W0129 15:27:08.696617 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffd96ccb_c3ae_4464_ac99_936e15780c1f.slice/crio-2f4b137ca637a4451f3c31b38071057df109bf7e4d3d823febcb6ba3b5733c0f WatchSource:0}: Error finding container 2f4b137ca637a4451f3c31b38071057df109bf7e4d3d823febcb6ba3b5733c0f: Status 404 returned error can't find the container with id 2f4b137ca637a4451f3c31b38071057df109bf7e4d3d823febcb6ba3b5733c0f
Jan 29 15:27:09 crc kubenswrapper[4753]: I0129 15:27:09.135975 4753 generic.go:334] "Generic (PLEG): container finished" podID="ffd96ccb-c3ae-4464-ac99-936e15780c1f" containerID="166553dd5a710aca9b0e0e02018aaf924bc1974ea43eb185e00a14ab9cf5ff9b" exitCode=0
Jan 29 15:27:09 crc kubenswrapper[4753]: I0129 15:27:09.136392 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ffd96ccb-c3ae-4464-ac99-936e15780c1f","Type":"ContainerDied","Data":"166553dd5a710aca9b0e0e02018aaf924bc1974ea43eb185e00a14ab9cf5ff9b"}
Jan 29 15:27:09 crc kubenswrapper[4753]: I0129 15:27:09.136529 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ffd96ccb-c3ae-4464-ac99-936e15780c1f","Type":"ContainerStarted","Data":"2f4b137ca637a4451f3c31b38071057df109bf7e4d3d823febcb6ba3b5733c0f"}
Jan 29 15:27:09 crc kubenswrapper[4753]: I0129 15:27:09.150232 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca"
Jan 29 15:27:09 crc kubenswrapper[4753]: E0129 15:27:09.150701 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.430778 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.459368 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ffd96ccb-c3ae-4464-ac99-936e15780c1f/mariadb-client/0.log" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.494299 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.502640 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.527456 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwsrl\" (UniqueName: \"kubernetes.io/projected/ffd96ccb-c3ae-4464-ac99-936e15780c1f-kube-api-access-lwsrl\") pod \"ffd96ccb-c3ae-4464-ac99-936e15780c1f\" (UID: \"ffd96ccb-c3ae-4464-ac99-936e15780c1f\") " Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.533851 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd96ccb-c3ae-4464-ac99-936e15780c1f-kube-api-access-lwsrl" (OuterVolumeSpecName: "kube-api-access-lwsrl") pod "ffd96ccb-c3ae-4464-ac99-936e15780c1f" (UID: "ffd96ccb-c3ae-4464-ac99-936e15780c1f"). InnerVolumeSpecName "kube-api-access-lwsrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.629674 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwsrl\" (UniqueName: \"kubernetes.io/projected/ffd96ccb-c3ae-4464-ac99-936e15780c1f-kube-api-access-lwsrl\") on node \"crc\" DevicePath \"\"" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.653499 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Jan 29 15:27:10 crc kubenswrapper[4753]: E0129 15:27:10.653770 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd96ccb-c3ae-4464-ac99-936e15780c1f" containerName="mariadb-client" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.653783 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd96ccb-c3ae-4464-ac99-936e15780c1f" containerName="mariadb-client" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.653958 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd96ccb-c3ae-4464-ac99-936e15780c1f" containerName="mariadb-client" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.654470 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.667067 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.833639 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rq7b\" (UniqueName: \"kubernetes.io/projected/c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4-kube-api-access-2rq7b\") pod \"mariadb-client\" (UID: \"c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4\") " pod="openstack/mariadb-client" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.935804 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rq7b\" (UniqueName: \"kubernetes.io/projected/c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4-kube-api-access-2rq7b\") pod \"mariadb-client\" (UID: \"c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4\") " pod="openstack/mariadb-client" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.957271 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rq7b\" (UniqueName: \"kubernetes.io/projected/c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4-kube-api-access-2rq7b\") pod \"mariadb-client\" (UID: \"c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4\") " pod="openstack/mariadb-client" Jan 29 15:27:10 crc kubenswrapper[4753]: I0129 15:27:10.976504 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 29 15:27:11 crc kubenswrapper[4753]: I0129 15:27:11.162910 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4b137ca637a4451f3c31b38071057df109bf7e4d3d823febcb6ba3b5733c0f" Jan 29 15:27:11 crc kubenswrapper[4753]: I0129 15:27:11.162960 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 29 15:27:11 crc kubenswrapper[4753]: I0129 15:27:11.265953 4753 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="ffd96ccb-c3ae-4464-ac99-936e15780c1f" podUID="c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4" Jan 29 15:27:11 crc kubenswrapper[4753]: W0129 15:27:11.613517 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4f6e0fe_3ca0_4f3d_8fb4_8a00470026b4.slice/crio-fddf81890c62a56aeaac05228d44ca1d4256d9e264c67e37c8b96d38e171292f WatchSource:0}: Error finding container fddf81890c62a56aeaac05228d44ca1d4256d9e264c67e37c8b96d38e171292f: Status 404 returned error can't find the container with id fddf81890c62a56aeaac05228d44ca1d4256d9e264c67e37c8b96d38e171292f Jan 29 15:27:11 crc kubenswrapper[4753]: I0129 15:27:11.619792 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 29 15:27:12 crc kubenswrapper[4753]: I0129 15:27:12.158085 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd96ccb-c3ae-4464-ac99-936e15780c1f" path="/var/lib/kubelet/pods/ffd96ccb-c3ae-4464-ac99-936e15780c1f/volumes" Jan 29 15:27:12 crc kubenswrapper[4753]: I0129 15:27:12.171190 4753 generic.go:334] "Generic (PLEG): container finished" podID="c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4" containerID="f97a8d18ab69bee8fda157f780873794c25a8c05f8048b4689b5ff7e8c376148" exitCode=0 Jan 29 15:27:12 crc kubenswrapper[4753]: I0129 15:27:12.171243 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4","Type":"ContainerDied","Data":"f97a8d18ab69bee8fda157f780873794c25a8c05f8048b4689b5ff7e8c376148"} Jan 29 15:27:12 crc kubenswrapper[4753]: I0129 15:27:12.171287 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4","Type":"ContainerStarted","Data":"fddf81890c62a56aeaac05228d44ca1d4256d9e264c67e37c8b96d38e171292f"} Jan 29 15:27:13 crc kubenswrapper[4753]: I0129 15:27:13.492378 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 29 15:27:13 crc kubenswrapper[4753]: I0129 15:27:13.523417 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4/mariadb-client/0.log" Jan 29 15:27:13 crc kubenswrapper[4753]: I0129 15:27:13.576591 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rq7b\" (UniqueName: \"kubernetes.io/projected/c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4-kube-api-access-2rq7b\") pod \"c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4\" (UID: \"c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4\") " Jan 29 15:27:13 crc kubenswrapper[4753]: I0129 15:27:13.587118 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 29 15:27:13 crc kubenswrapper[4753]: I0129 15:27:13.592297 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Jan 29 15:27:13 crc kubenswrapper[4753]: I0129 15:27:13.603334 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4-kube-api-access-2rq7b" (OuterVolumeSpecName: "kube-api-access-2rq7b") pod "c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4" (UID: "c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4"). 
InnerVolumeSpecName "kube-api-access-2rq7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:27:13 crc kubenswrapper[4753]: I0129 15:27:13.678254 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rq7b\" (UniqueName: \"kubernetes.io/projected/c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4-kube-api-access-2rq7b\") on node \"crc\" DevicePath \"\"" Jan 29 15:27:14 crc kubenswrapper[4753]: I0129 15:27:14.159686 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4" path="/var/lib/kubelet/pods/c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4/volumes" Jan 29 15:27:14 crc kubenswrapper[4753]: I0129 15:27:14.186127 4753 scope.go:117] "RemoveContainer" containerID="f97a8d18ab69bee8fda157f780873794c25a8c05f8048b4689b5ff7e8c376148" Jan 29 15:27:14 crc kubenswrapper[4753]: I0129 15:27:14.186194 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 29 15:27:21 crc kubenswrapper[4753]: I0129 15:27:21.149693 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:27:21 crc kubenswrapper[4753]: E0129 15:27:21.152552 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:27:36 crc kubenswrapper[4753]: I0129 15:27:36.155737 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:27:36 crc kubenswrapper[4753]: E0129 15:27:36.156781 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.777926 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 15:27:47 crc kubenswrapper[4753]: E0129 15:27:47.778977 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4" containerName="mariadb-client" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.778998 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4" containerName="mariadb-client" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.779290 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4f6e0fe-3ca0-4f3d-8fb4-8a00470026b4" containerName="mariadb-client" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.780641 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.782582 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-t5dvm" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.782919 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.783013 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.788913 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.791280 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.814265 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.816339 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.817832 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.824768 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.826000 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqzft\" (UniqueName: \"kubernetes.io/projected/4da64566-6ed8-41ab-aaa7-354bead2c806-kube-api-access-mqzft\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.826054 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a3097a4b-4713-44fe-9dcb-ff5b55a19381\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a3097a4b-4713-44fe-9dcb-ff5b55a19381\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.826086 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da64566-6ed8-41ab-aaa7-354bead2c806-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.826115 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4da64566-6ed8-41ab-aaa7-354bead2c806-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.826236 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4da64566-6ed8-41ab-aaa7-354bead2c806-config\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.826302 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4da64566-6ed8-41ab-aaa7-354bead2c806-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.837177 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927286 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/231b5162-2261-4cff-80bb-10a61ef63095-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927633 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-91923d7b-64d3-4e53-952d-15d607c4ff62\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91923d7b-64d3-4e53-952d-15d607c4ff62\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927668 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/df70109c-a132-46c6-92d9-ea3af4d90b11-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927700 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df70109c-a132-46c6-92d9-ea3af4d90b11-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927736 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqzft\" (UniqueName: \"kubernetes.io/projected/4da64566-6ed8-41ab-aaa7-354bead2c806-kube-api-access-mqzft\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927846 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a3097a4b-4713-44fe-9dcb-ff5b55a19381\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a3097a4b-4713-44fe-9dcb-ff5b55a19381\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927886 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-56290be1-bed7-4954-bfdf-0138dd828b28\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56290be1-bed7-4954-bfdf-0138dd828b28\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927920 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da64566-6ed8-41ab-aaa7-354bead2c806-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927948 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df70109c-a132-46c6-92d9-ea3af4d90b11-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.927981 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4da64566-6ed8-41ab-aaa7-354bead2c806-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.928025 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/231b5162-2261-4cff-80bb-10a61ef63095-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.928061 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpqnw\" (UniqueName: \"kubernetes.io/projected/231b5162-2261-4cff-80bb-10a61ef63095-kube-api-access-cpqnw\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.928096 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4da64566-6ed8-41ab-aaa7-354bead2c806-config\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.928139 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231b5162-2261-4cff-80bb-10a61ef63095-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.928195 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df70109c-a132-46c6-92d9-ea3af4d90b11-config\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.928251 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231b5162-2261-4cff-80bb-10a61ef63095-config\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.928294 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wj6p\" (UniqueName: \"kubernetes.io/projected/df70109c-a132-46c6-92d9-ea3af4d90b11-kube-api-access-7wj6p\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.928344 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4da64566-6ed8-41ab-aaa7-354bead2c806-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.928873 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4da64566-6ed8-41ab-aaa7-354bead2c806-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.929188 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4da64566-6ed8-41ab-aaa7-354bead2c806-config\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.929489 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4da64566-6ed8-41ab-aaa7-354bead2c806-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.932016 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.932063 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a3097a4b-4713-44fe-9dcb-ff5b55a19381\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a3097a4b-4713-44fe-9dcb-ff5b55a19381\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e3d3a5814efbec1e143d0b8aff874bd45e70ea280319e7436eaf1f2bdee777d4/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.934984 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4da64566-6ed8-41ab-aaa7-354bead2c806-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.946844 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqzft\" (UniqueName: \"kubernetes.io/projected/4da64566-6ed8-41ab-aaa7-354bead2c806-kube-api-access-mqzft\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.968166 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a3097a4b-4713-44fe-9dcb-ff5b55a19381\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a3097a4b-4713-44fe-9dcb-ff5b55a19381\") pod \"ovsdbserver-nb-0\" (UID: \"4da64566-6ed8-41ab-aaa7-354bead2c806\") " pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.995017 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.996133 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.998290 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 29 15:27:47 crc kubenswrapper[4753]: I0129 15:27:47.998516 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.001298 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-j88v5" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.006740 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.022392 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.028888 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.030821 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/231b5162-2261-4cff-80bb-10a61ef63095-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.030951 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpqnw\" (UniqueName: \"kubernetes.io/projected/231b5162-2261-4cff-80bb-10a61ef63095-kube-api-access-cpqnw\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031015 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231b5162-2261-4cff-80bb-10a61ef63095-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031051 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df70109c-a132-46c6-92d9-ea3af4d90b11-config\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031085 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231b5162-2261-4cff-80bb-10a61ef63095-config\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031189 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wj6p\" (UniqueName: \"kubernetes.io/projected/df70109c-a132-46c6-92d9-ea3af4d90b11-kube-api-access-7wj6p\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031257 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-91923d7b-64d3-4e53-952d-15d607c4ff62\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91923d7b-64d3-4e53-952d-15d607c4ff62\") pod 
\"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031291 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/231b5162-2261-4cff-80bb-10a61ef63095-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031350 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/df70109c-a132-46c6-92d9-ea3af4d90b11-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031395 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df70109c-a132-46c6-92d9-ea3af4d90b11-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031464 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-56290be1-bed7-4954-bfdf-0138dd828b28\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56290be1-bed7-4954-bfdf-0138dd828b28\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.031498 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df70109c-a132-46c6-92d9-ea3af4d90b11-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.035201 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.035285 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/231b5162-2261-4cff-80bb-10a61ef63095-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.035424 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231b5162-2261-4cff-80bb-10a61ef63095-config\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.037856 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.037876 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-91923d7b-64d3-4e53-952d-15d607c4ff62\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91923d7b-64d3-4e53-952d-15d607c4ff62\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/369ef588bf5da84b4ab35570d7e871263d8c918dd05f70fe0d6a7eb4286c394e/globalmount\"" pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.043446 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.044307 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231b5162-2261-4cff-80bb-10a61ef63095-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.044735 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/231b5162-2261-4cff-80bb-10a61ef63095-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.048338 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.048372 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-56290be1-bed7-4954-bfdf-0138dd828b28\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56290be1-bed7-4954-bfdf-0138dd828b28\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0b05329b3fe5e53c309661e0ba82fbfeba8bf3aaf19abb8a4996b69cd03f065e/globalmount\"" pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.050106 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/df70109c-a132-46c6-92d9-ea3af4d90b11-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.050876 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df70109c-a132-46c6-92d9-ea3af4d90b11-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.061252 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df70109c-a132-46c6-92d9-ea3af4d90b11-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.056395 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df70109c-a132-46c6-92d9-ea3af4d90b11-config\") pod \"ovsdbserver-nb-2\" 
(UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.063260 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wj6p\" (UniqueName: \"kubernetes.io/projected/df70109c-a132-46c6-92d9-ea3af4d90b11-kube-api-access-7wj6p\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.066603 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpqnw\" (UniqueName: \"kubernetes.io/projected/231b5162-2261-4cff-80bb-10a61ef63095-kube-api-access-cpqnw\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.074597 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.078213 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.099873 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-56290be1-bed7-4954-bfdf-0138dd828b28\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56290be1-bed7-4954-bfdf-0138dd828b28\") pod \"ovsdbserver-nb-2\" (UID: \"df70109c-a132-46c6-92d9-ea3af4d90b11\") " pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.101496 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.104606 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-91923d7b-64d3-4e53-952d-15d607c4ff62\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91923d7b-64d3-4e53-952d-15d607c4ff62\") pod \"ovsdbserver-nb-1\" (UID: \"231b5162-2261-4cff-80bb-10a61ef63095\") " pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.134131 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.134209 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckwbj\" (UniqueName: \"kubernetes.io/projected/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-kube-api-access-ckwbj\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.134274 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-config\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.134309 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-240a6f3f-8091-4937-a944-fefe131a1728\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-240a6f3f-8091-4937-a944-fefe131a1728\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.134360 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.134575 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.163198 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.170911 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236022 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckwbj\" (UniqueName: \"kubernetes.io/projected/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-kube-api-access-ckwbj\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236459 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e4a88502-4db4-4387-a942-bba43250fb20-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236508 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-config\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236541 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-49bb4600-50db-4a8e-8371-3c73ddacf9d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49bb4600-50db-4a8e-8371-3c73ddacf9d9\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236572 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236600 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-240a6f3f-8091-4937-a944-fefe131a1728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-240a6f3f-8091-4937-a944-fefe131a1728\") pod \"ovsdbserver-sb-0\" (UID: 
\"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236624 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236663 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3fff0849-d763-4738-a2c6-737b392decf7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fff0849-d763-4738-a2c6-737b392decf7\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236708 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4a88502-4db4-4387-a942-bba43250fb20-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236738 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a88502-4db4-4387-a942-bba43250fb20-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236763 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236839 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljbvq\" (UniqueName: \"kubernetes.io/projected/e4a88502-4db4-4387-a942-bba43250fb20-kube-api-access-ljbvq\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236863 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7skf\" (UniqueName: \"kubernetes.io/projected/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-kube-api-access-m7skf\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.236885 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.237317 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 
15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.237358 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a88502-4db4-4387-a942-bba43250fb20-config\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.237387 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-config\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.237457 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.237958 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.238922 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-config\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.238932 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.243412 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.243456 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-240a6f3f-8091-4937-a944-fefe131a1728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-240a6f3f-8091-4937-a944-fefe131a1728\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/366e6d83559fe511dee2894bb7585c8f9d225594b33884434898df259b27c1c4/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.245824 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.260990 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckwbj\" (UniqueName: \"kubernetes.io/projected/ff76ca68-ec4f-4730-9c99-8e51389ba0a6-kube-api-access-ckwbj\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.289437 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-240a6f3f-8091-4937-a944-fefe131a1728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-240a6f3f-8091-4937-a944-fefe131a1728\") pod \"ovsdbserver-sb-0\" (UID: \"ff76ca68-ec4f-4730-9c99-8e51389ba0a6\") " pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339134 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339397 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a88502-4db4-4387-a942-bba43250fb20-config\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339423 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-config\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339479 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e4a88502-4db4-4387-a942-bba43250fb20-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339527 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49bb4600-50db-4a8e-8371-3c73ddacf9d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49bb4600-50db-4a8e-8371-3c73ddacf9d9\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 
crc kubenswrapper[4753]: I0129 15:27:48.339558 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339596 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3fff0849-d763-4738-a2c6-737b392decf7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fff0849-d763-4738-a2c6-737b392decf7\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339622 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339640 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4a88502-4db4-4387-a942-bba43250fb20-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339689 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a88502-4db4-4387-a942-bba43250fb20-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339748 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljbvq\" (UniqueName: \"kubernetes.io/projected/e4a88502-4db4-4387-a942-bba43250fb20-kube-api-access-ljbvq\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339771 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7skf\" (UniqueName: \"kubernetes.io/projected/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-kube-api-access-m7skf\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.339800 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.340472 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a88502-4db4-4387-a942-bba43250fb20-config\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.340532 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-config\") pod \"ovsdbserver-sb-1\" (UID: 
\"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.340936 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.341389 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e4a88502-4db4-4387-a942-bba43250fb20-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.342077 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4a88502-4db4-4387-a942-bba43250fb20-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.344242 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.344276 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49bb4600-50db-4a8e-8371-3c73ddacf9d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49bb4600-50db-4a8e-8371-3c73ddacf9d9\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f046a959b36d29cb825a21caed7302d6c605901af395aa7f6e97942aeb135af7/globalmount\"" pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.345026 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.345070 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3fff0849-d763-4738-a2c6-737b392decf7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fff0849-d763-4738-a2c6-737b392decf7\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1a2212ec2539e8aa9c526b59f5b940bcb6d4d8f17c64855d180429094db9c90/globalmount\"" pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.355257 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.356805 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a88502-4db4-4387-a942-bba43250fb20-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.366758 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljbvq\" (UniqueName: \"kubernetes.io/projected/e4a88502-4db4-4387-a942-bba43250fb20-kube-api-access-ljbvq\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.367091 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7skf\" (UniqueName: \"kubernetes.io/projected/f34befd5-5c61-47e0-8dd6-e3637efdbc8d-kube-api-access-m7skf\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.373267 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-49bb4600-50db-4a8e-8371-3c73ddacf9d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49bb4600-50db-4a8e-8371-3c73ddacf9d9\") pod \"ovsdbserver-sb-2\" (UID: \"e4a88502-4db4-4387-a942-bba43250fb20\") " pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.398463 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3fff0849-d763-4738-a2c6-737b392decf7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fff0849-d763-4738-a2c6-737b392decf7\") pod \"ovsdbserver-sb-1\" (UID: \"f34befd5-5c61-47e0-8dd6-e3637efdbc8d\") " pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.428116 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.438354 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.448604 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.614397 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 29 15:27:48 crc kubenswrapper[4753]: I0129 15:27:48.721224 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 29 15:27:48 crc kubenswrapper[4753]: W0129 15:27:48.724865 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod231b5162_2261_4cff_80bb_10a61ef63095.slice/crio-0f5bc5602afd8020d3325b1e946a296aff71f23a9951d8494c7ffa4b13d3920a WatchSource:0}: Error finding container 0f5bc5602afd8020d3325b1e946a296aff71f23a9951d8494c7ffa4b13d3920a: Status 404 returned error can't find the container with id 0f5bc5602afd8020d3325b1e946a296aff71f23a9951d8494c7ffa4b13d3920a Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.067022 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 29 15:27:49 crc kubenswrapper[4753]: W0129 15:27:49.071961 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4a88502_4db4_4387_a942_bba43250fb20.slice/crio-3193b1a89247fdfc287c61e7178136e34baf0528c80153d4f5a91e7a2a01c13a WatchSource:0}: Error finding container 3193b1a89247fdfc287c61e7178136e34baf0528c80153d4f5a91e7a2a01c13a: Status 404 returned error can't find the container with id 3193b1a89247fdfc287c61e7178136e34baf0528c80153d4f5a91e7a2a01c13a Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.149674 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:27:49 crc kubenswrapper[4753]: E0129 15:27:49.149917 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.500855 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"231b5162-2261-4cff-80bb-10a61ef63095","Type":"ContainerStarted","Data":"3963257943e28afef9dd20bfd8d4afba210d1c0ccada84852f7260cda7e94e73"} Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.500933 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"231b5162-2261-4cff-80bb-10a61ef63095","Type":"ContainerStarted","Data":"6ab2f2d8de099ca685ce6663b2c5550893914a558121ea809b81b06f31c81836"} Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.500955 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"231b5162-2261-4cff-80bb-10a61ef63095","Type":"ContainerStarted","Data":"0f5bc5602afd8020d3325b1e946a296aff71f23a9951d8494c7ffa4b13d3920a"} Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.502967 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4da64566-6ed8-41ab-aaa7-354bead2c806","Type":"ContainerStarted","Data":"d93f3c866f9484c5d38db7f141bb72f606b2cc0245a25a84e90ad5e5e30045d0"} Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.503052 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4da64566-6ed8-41ab-aaa7-354bead2c806","Type":"ContainerStarted","Data":"be3aaa2fb6765a2b0896ed5f6b9d6710faaeb5b2221c0fba421b43dda6080638"} Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.503063 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4da64566-6ed8-41ab-aaa7-354bead2c806","Type":"ContainerStarted","Data":"8ed3f28fe471e07d2f7086411309a06ec89400b67c9a3ae1b28ca3c62e644353"} Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.504578 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e4a88502-4db4-4387-a942-bba43250fb20","Type":"ContainerStarted","Data":"22661502ddf135a71ca0837ce18766911c6ee1cd66e216d57b6e4a5165e695b3"} Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.504603 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e4a88502-4db4-4387-a942-bba43250fb20","Type":"ContainerStarted","Data":"5b259b9b7f1c802d1d1117d7a9dd6c98c079c765fa57c9d124f6c095d664992c"} Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.504612 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e4a88502-4db4-4387-a942-bba43250fb20","Type":"ContainerStarted","Data":"3193b1a89247fdfc287c61e7178136e34baf0528c80153d4f5a91e7a2a01c13a"} Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.534049 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.534026796 podStartE2EDuration="3.534026796s" podCreationTimestamp="2026-01-29 15:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:27:49.52748999 +0000 UTC m=+5104.222224372" watchObservedRunningTime="2026-01-29 15:27:49.534026796 +0000 UTC m=+5104.228761168" Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.554524 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.554502726 podStartE2EDuration="3.554502726s" podCreationTimestamp="2026-01-29 15:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:27:49.546284025 +0000 UTC m=+5104.241018457" watchObservedRunningTime="2026-01-29 15:27:49.554502726 +0000 UTC m=+5104.249237108" Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.588044 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.588018658 podStartE2EDuration="3.588018658s" podCreationTimestamp="2026-01-29 15:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:27:49.571350679 +0000 UTC m=+5104.266085071" watchObservedRunningTime="2026-01-29 15:27:49.588018658 +0000 UTC m=+5104.282753080" Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.737709 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.844528 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 29 15:27:49 crc kubenswrapper[4753]: I0129 15:27:49.951445 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-1"] Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.514762 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ff76ca68-ec4f-4730-9c99-8e51389ba0a6","Type":"ContainerStarted","Data":"b5de29f40f9a964adaf87f9949bf17633b32ee6f4d879e7cf3c56139344788da"} Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.515139 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ff76ca68-ec4f-4730-9c99-8e51389ba0a6","Type":"ContainerStarted","Data":"a4a6781c2d4b6eb1cc968490719ef0d15be34237a596b8307934aeb678419e88"} Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.515174 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ff76ca68-ec4f-4730-9c99-8e51389ba0a6","Type":"ContainerStarted","Data":"d302c044175ec1219e81347f7787657f4a6dec65c008b9c741bb3173d4e868d9"} Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.517517 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"df70109c-a132-46c6-92d9-ea3af4d90b11","Type":"ContainerStarted","Data":"b7af922de646d92aba8c9829553d6579d2e8c9977d24a3da55761cec1a56856d"} Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.517585 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"df70109c-a132-46c6-92d9-ea3af4d90b11","Type":"ContainerStarted","Data":"f0a0847a19785c721fdf8253362724dab62a3c016cbb84a7d8bf76fdefb20baa"} Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.517605 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"df70109c-a132-46c6-92d9-ea3af4d90b11","Type":"ContainerStarted","Data":"9a083e07829204ec4647f2809c991fcc81d4e9a3cf20c1b811892ad3cf2e1d89"} Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.519485 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"f34befd5-5c61-47e0-8dd6-e3637efdbc8d","Type":"ContainerStarted","Data":"920c11afaa9a78832c80836ab352fb4da7fbd1ddef8cb0b906623aa99453e2ac"} Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.519559 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"f34befd5-5c61-47e0-8dd6-e3637efdbc8d","Type":"ContainerStarted","Data":"58acb2848d6bed71f73b96301be40a2cba4346271b692d0b7b7f5c167ef5f78b"} Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.519590 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"f34befd5-5c61-47e0-8dd6-e3637efdbc8d","Type":"ContainerStarted","Data":"9751d1e582277a7eeee4cf0ae0fdcc1d0f983f142aa83d0e135ee3de3e3826ca"} Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.538261 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.538243757 podStartE2EDuration="4.538243757s" podCreationTimestamp="2026-01-29 15:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:27:50.536338975 +0000 UTC m=+5105.231073437" watchObservedRunningTime="2026-01-29 15:27:50.538243757 +0000 UTC m=+5105.232978139" Jan 29 15:27:50 crc kubenswrapper[4753]: I0129 15:27:50.560987 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.560961447 
podStartE2EDuration="4.560961447s" podCreationTimestamp="2026-01-29 15:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:27:50.554599037 +0000 UTC m=+5105.249333419" watchObservedRunningTime="2026-01-29 15:27:50.560961447 +0000 UTC m=+5105.255695869" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.101759 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.156021 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.164059 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.171517 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.179373 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=5.179350975 podStartE2EDuration="5.179350975s" podCreationTimestamp="2026-01-29 15:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:27:50.574742578 +0000 UTC m=+5105.269476970" watchObservedRunningTime="2026-01-29 15:27:51.179350975 +0000 UTC m=+5105.874085367" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.242824 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.428888 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.439828 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.448986 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.530244 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:51 crc kubenswrapper[4753]: I0129 15:27:51.530521 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.158974 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.163638 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.221099 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.400428 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d75cc5ff9-mgdjw"] Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.402266 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.405499 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.411412 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75cc5ff9-mgdjw"] Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.428426 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.439083 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.449623 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.457750 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.457825 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-config\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.457870 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dljkn\" (UniqueName: \"kubernetes.io/projected/9095a2b4-327a-4fde-a633-85dce709cb6d-kube-api-access-dljkn\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.457931 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-dns-svc\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.559653 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-dns-svc\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.559770 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.559818 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-config\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: 
\"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.559856 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dljkn\" (UniqueName: \"kubernetes.io/projected/9095a2b4-327a-4fde-a633-85dce709cb6d-kube-api-access-dljkn\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.560965 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-dns-svc\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.561008 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-config\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.561630 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.591416 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dljkn\" (UniqueName: \"kubernetes.io/projected/9095a2b4-327a-4fde-a633-85dce709cb6d-kube-api-access-dljkn\") pod \"dnsmasq-dns-5d75cc5ff9-mgdjw\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:53 crc kubenswrapper[4753]: I0129 15:27:53.730222 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.335901 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75cc5ff9-mgdjw"] Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.341503 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.507822 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.551503 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.557660 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" event={"ID":"9095a2b4-327a-4fde-a633-85dce709cb6d","Type":"ContainerStarted","Data":"a31409f256e9b81f457ea8216b141d18ecf95f598098432cb9ad7374389ad0db"} Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.581798 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.631847 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.631968 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.640497 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.843102 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d75cc5ff9-mgdjw"] Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.864666 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c4fbb6c5-k6k4z"] Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.872309 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.874711 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.882642 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4fbb6c5-k6k4z"] Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.986462 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-dns-svc\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.986549 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.986594 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.986747 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9s2l\" (UniqueName: \"kubernetes.io/projected/05ee8739-6c98-455d-a5e8-f9b7b55788ed-kube-api-access-n9s2l\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:54 crc kubenswrapper[4753]: I0129 15:27:54.987041 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-config\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.088660 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-config\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.088821 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-dns-svc\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.089701 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-config\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 
15:27:55.090133 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-dns-svc\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.090378 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.091466 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.091643 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.093095 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.093631 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9s2l\" (UniqueName: \"kubernetes.io/projected/05ee8739-6c98-455d-a5e8-f9b7b55788ed-kube-api-access-n9s2l\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.110084 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9s2l\" (UniqueName: \"kubernetes.io/projected/05ee8739-6c98-455d-a5e8-f9b7b55788ed-kube-api-access-n9s2l\") pod \"dnsmasq-dns-c4fbb6c5-k6k4z\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.188931 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.567451 4753 generic.go:334] "Generic (PLEG): container finished" podID="9095a2b4-327a-4fde-a633-85dce709cb6d" containerID="e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c" exitCode=0 Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.567513 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" event={"ID":"9095a2b4-327a-4fde-a633-85dce709cb6d","Type":"ContainerDied","Data":"e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c"} Jan 29 15:27:55 crc kubenswrapper[4753]: I0129 15:27:55.697970 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4fbb6c5-k6k4z"] Jan 29 15:27:56 crc kubenswrapper[4753]: I0129 15:27:56.579064 4753 generic.go:334] "Generic (PLEG): container finished" podID="05ee8739-6c98-455d-a5e8-f9b7b55788ed" containerID="35b5bb938da1b4b9e01854d47efc2d569a95a59a023d82c219b26874f9d17b63" exitCode=0 Jan 29 15:27:56 crc kubenswrapper[4753]: I0129 15:27:56.579193 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" event={"ID":"05ee8739-6c98-455d-a5e8-f9b7b55788ed","Type":"ContainerDied","Data":"35b5bb938da1b4b9e01854d47efc2d569a95a59a023d82c219b26874f9d17b63"} Jan 29 15:27:56 crc kubenswrapper[4753]: I0129 15:27:56.579232 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" event={"ID":"05ee8739-6c98-455d-a5e8-f9b7b55788ed","Type":"ContainerStarted","Data":"5dba340e6c256bf385daf5cacc7af19f6974f757f29fc75c85eee5e646ebe2cf"} Jan 29 15:27:56 crc kubenswrapper[4753]: I0129 15:27:56.583036 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" event={"ID":"9095a2b4-327a-4fde-a633-85dce709cb6d","Type":"ContainerStarted","Data":"1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac"} Jan 29 15:27:56 crc kubenswrapper[4753]: I0129 15:27:56.583357 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:56 crc kubenswrapper[4753]: I0129 15:27:56.583413 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" podUID="9095a2b4-327a-4fde-a633-85dce709cb6d" containerName="dnsmasq-dns" containerID="cri-o://1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac" gracePeriod=10 Jan 29 15:27:56 crc kubenswrapper[4753]: I0129 15:27:56.647659 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" podStartSLOduration=3.647634881 podStartE2EDuration="3.647634881s" podCreationTimestamp="2026-01-29 15:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:27:56.64014735 +0000 UTC m=+5111.334881742" watchObservedRunningTime="2026-01-29 15:27:56.647634881 +0000 UTC m=+5111.342369273" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.111585 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.235292 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dljkn\" (UniqueName: \"kubernetes.io/projected/9095a2b4-327a-4fde-a633-85dce709cb6d-kube-api-access-dljkn\") pod \"9095a2b4-327a-4fde-a633-85dce709cb6d\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.235338 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-dns-svc\") pod \"9095a2b4-327a-4fde-a633-85dce709cb6d\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.235370 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-config\") pod \"9095a2b4-327a-4fde-a633-85dce709cb6d\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.235442 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-ovsdbserver-nb\") pod \"9095a2b4-327a-4fde-a633-85dce709cb6d\" (UID: \"9095a2b4-327a-4fde-a633-85dce709cb6d\") " Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.244853 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9095a2b4-327a-4fde-a633-85dce709cb6d-kube-api-access-dljkn" (OuterVolumeSpecName: "kube-api-access-dljkn") pod "9095a2b4-327a-4fde-a633-85dce709cb6d" (UID: "9095a2b4-327a-4fde-a633-85dce709cb6d"). InnerVolumeSpecName "kube-api-access-dljkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.273531 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-config" (OuterVolumeSpecName: "config") pod "9095a2b4-327a-4fde-a633-85dce709cb6d" (UID: "9095a2b4-327a-4fde-a633-85dce709cb6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.273751 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9095a2b4-327a-4fde-a633-85dce709cb6d" (UID: "9095a2b4-327a-4fde-a633-85dce709cb6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.274576 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9095a2b4-327a-4fde-a633-85dce709cb6d" (UID: "9095a2b4-327a-4fde-a633-85dce709cb6d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.337308 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dljkn\" (UniqueName: \"kubernetes.io/projected/9095a2b4-327a-4fde-a633-85dce709cb6d-kube-api-access-dljkn\") on node \"crc\" DevicePath \"\"" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.337350 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.337367 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.337385 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9095a2b4-327a-4fde-a633-85dce709cb6d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.594808 4753 generic.go:334] "Generic (PLEG): container finished" podID="9095a2b4-327a-4fde-a633-85dce709cb6d" containerID="1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac" exitCode=0 Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.594906 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.594924 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" event={"ID":"9095a2b4-327a-4fde-a633-85dce709cb6d","Type":"ContainerDied","Data":"1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac"} Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.594959 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75cc5ff9-mgdjw" event={"ID":"9095a2b4-327a-4fde-a633-85dce709cb6d","Type":"ContainerDied","Data":"a31409f256e9b81f457ea8216b141d18ecf95f598098432cb9ad7374389ad0db"} Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.594981 4753 scope.go:117] "RemoveContainer" containerID="1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.597772 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" event={"ID":"05ee8739-6c98-455d-a5e8-f9b7b55788ed","Type":"ContainerStarted","Data":"ff10efe4e2d91bbf36c5c780589d145331c9d9eeaa71ad4903c58cdd437c498b"} Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.598541 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.629364 4753 scope.go:117] "RemoveContainer" containerID="e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.640047 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" podStartSLOduration=3.640022374 podStartE2EDuration="3.640022374s" podCreationTimestamp="2026-01-29 15:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:27:57.638821322 +0000 UTC m=+5112.333555734" watchObservedRunningTime="2026-01-29 
15:27:57.640022374 +0000 UTC m=+5112.334756766" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.671118 4753 scope.go:117] "RemoveContainer" containerID="1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac" Jan 29 15:27:57 crc kubenswrapper[4753]: E0129 15:27:57.671800 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac\": container with ID starting with 1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac not found: ID does not exist" containerID="1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.671834 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac"} err="failed to get container status \"1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac\": rpc error: code = NotFound desc = could not find container \"1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac\": container with ID starting with 1dda434f3efa7ff3bb58a6ddaec72bc40376890211f9f1c9644a874f7040d8ac not found: ID does not exist" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.671858 4753 scope.go:117] "RemoveContainer" containerID="e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c" Jan 29 15:27:57 crc kubenswrapper[4753]: E0129 15:27:57.672336 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c\": container with ID starting with e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c not found: ID does not exist" containerID="e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.672366 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c"} err="failed to get container status \"e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c\": rpc error: code = NotFound desc = could not find container \"e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c\": container with ID starting with e9d3f378128313e178045405c2a921f6a3cf562802b08658208ab391fb2c818c not found: ID does not exist" Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.687653 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d75cc5ff9-mgdjw"] Jan 29 15:27:57 crc kubenswrapper[4753]: I0129 15:27:57.703047 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d75cc5ff9-mgdjw"] Jan 29 15:27:58 crc kubenswrapper[4753]: I0129 15:27:58.180251 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9095a2b4-327a-4fde-a633-85dce709cb6d" path="/var/lib/kubelet/pods/9095a2b4-327a-4fde-a633-85dce709cb6d/volumes" Jan 29 15:27:58 crc kubenswrapper[4753]: I0129 15:27:58.492774 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.154893 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:28:00 crc kubenswrapper[4753]: E0129 15:28:00.155712 4753 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.729786 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Jan 29 15:28:00 crc kubenswrapper[4753]: E0129 15:28:00.730230 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9095a2b4-327a-4fde-a633-85dce709cb6d" containerName="init" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.730250 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9095a2b4-327a-4fde-a633-85dce709cb6d" containerName="init" Jan 29 15:28:00 crc kubenswrapper[4753]: E0129 15:28:00.730301 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9095a2b4-327a-4fde-a633-85dce709cb6d" containerName="dnsmasq-dns" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.730311 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9095a2b4-327a-4fde-a633-85dce709cb6d" containerName="dnsmasq-dns" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.730508 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9095a2b4-327a-4fde-a633-85dce709cb6d" containerName="dnsmasq-dns" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.731119 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.733880 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.742273 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.894413 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdfz\" (UniqueName: \"kubernetes.io/projected/b4637574-7d3a-4145-91e7-b46c949531e2-kube-api-access-2pdfz\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") " pod="openstack/ovn-copy-data" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.894494 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/b4637574-7d3a-4145-91e7-b46c949531e2-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") " pod="openstack/ovn-copy-data" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.894819 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-924f45ce-8399-4c78-bf87-fa3e7890ae3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-924f45ce-8399-4c78-bf87-fa3e7890ae3f\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") " pod="openstack/ovn-copy-data" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.995781 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-924f45ce-8399-4c78-bf87-fa3e7890ae3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-924f45ce-8399-4c78-bf87-fa3e7890ae3f\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") " 
pod="openstack/ovn-copy-data" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.995850 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdfz\" (UniqueName: \"kubernetes.io/projected/b4637574-7d3a-4145-91e7-b46c949531e2-kube-api-access-2pdfz\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") " pod="openstack/ovn-copy-data" Jan 29 15:28:00 crc kubenswrapper[4753]: I0129 15:28:00.995875 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/b4637574-7d3a-4145-91e7-b46c949531e2-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") " pod="openstack/ovn-copy-data" Jan 29 15:28:01 crc kubenswrapper[4753]: I0129 15:28:01.000460 4753 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 29 15:28:01 crc kubenswrapper[4753]: I0129 15:28:01.000502 4753 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-924f45ce-8399-4c78-bf87-fa3e7890ae3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-924f45ce-8399-4c78-bf87-fa3e7890ae3f\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c69e9f53358a2e05786259d689ea129e45bfc34a5eba8161a36681d23ca50da1/globalmount\"" pod="openstack/ovn-copy-data" Jan 29 15:28:01 crc kubenswrapper[4753]: I0129 15:28:01.009053 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/b4637574-7d3a-4145-91e7-b46c949531e2-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") " pod="openstack/ovn-copy-data" Jan 29 15:28:01 crc kubenswrapper[4753]: I0129 15:28:01.034518 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdfz\" (UniqueName: \"kubernetes.io/projected/b4637574-7d3a-4145-91e7-b46c949531e2-kube-api-access-2pdfz\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") " pod="openstack/ovn-copy-data" Jan 29 15:28:01 crc kubenswrapper[4753]: I0129 15:28:01.038455 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-924f45ce-8399-4c78-bf87-fa3e7890ae3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-924f45ce-8399-4c78-bf87-fa3e7890ae3f\") pod \"ovn-copy-data\" (UID: \"b4637574-7d3a-4145-91e7-b46c949531e2\") " pod="openstack/ovn-copy-data" Jan 29 15:28:01 crc kubenswrapper[4753]: I0129 15:28:01.073294 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Jan 29 15:28:01 crc kubenswrapper[4753]: W0129 15:28:01.675901 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4637574_7d3a_4145_91e7_b46c949531e2.slice/crio-b373fdf55df72e335a8d0bf151c6a2fc9dd60ab2d3457968dcf1ca4d1933ab63 WatchSource:0}: Error finding container b373fdf55df72e335a8d0bf151c6a2fc9dd60ab2d3457968dcf1ca4d1933ab63: Status 404 returned error can't find the container with id b373fdf55df72e335a8d0bf151c6a2fc9dd60ab2d3457968dcf1ca4d1933ab63 Jan 29 15:28:01 crc kubenswrapper[4753]: I0129 15:28:01.678253 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 15:28:01 crc kubenswrapper[4753]: I0129 15:28:01.683425 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Jan 29 15:28:02 crc kubenswrapper[4753]: I0129 15:28:02.647249 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"b4637574-7d3a-4145-91e7-b46c949531e2","Type":"ContainerStarted","Data":"4abfa68df3991a2d412c2fa6c9f53d0457348c555a4160156d90cc433507d206"} Jan 29 15:28:02 crc kubenswrapper[4753]: I0129 15:28:02.647629 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"b4637574-7d3a-4145-91e7-b46c949531e2","Type":"ContainerStarted","Data":"b373fdf55df72e335a8d0bf151c6a2fc9dd60ab2d3457968dcf1ca4d1933ab63"} Jan 29 15:28:02 crc kubenswrapper[4753]: I0129 15:28:02.674469 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.028700974 podStartE2EDuration="3.674440936s" podCreationTimestamp="2026-01-29 15:27:59 +0000 UTC" firstStartedPulling="2026-01-29 15:28:01.678011015 +0000 UTC m=+5116.372745397" lastFinishedPulling="2026-01-29 15:28:02.323750977 +0000 UTC m=+5117.018485359" observedRunningTime="2026-01-29 15:28:02.666941794 +0000 UTC m=+5117.361676216" watchObservedRunningTime="2026-01-29 15:28:02.674440936 +0000 UTC m=+5117.369175348" Jan 29 15:28:05 crc kubenswrapper[4753]: I0129 15:28:05.191868 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:28:05 crc kubenswrapper[4753]: I0129 15:28:05.279032 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-vj7cn"] Jan 29 15:28:05 crc kubenswrapper[4753]: I0129 15:28:05.279447 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" podUID="28c0dd77-1fe3-4952-be0d-e9d0ef321273" containerName="dnsmasq-dns" containerID="cri-o://2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1" gracePeriod=10 Jan 29 15:28:05 crc kubenswrapper[4753]: I0129 15:28:05.670782 4753 generic.go:334] "Generic (PLEG): container finished" podID="28c0dd77-1fe3-4952-be0d-e9d0ef321273" containerID="2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1" exitCode=0 Jan 29 15:28:05 crc kubenswrapper[4753]: I0129 15:28:05.671069 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" event={"ID":"28c0dd77-1fe3-4952-be0d-e9d0ef321273","Type":"ContainerDied","Data":"2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1"} Jan 29 15:28:05 crc kubenswrapper[4753]: I0129 15:28:05.819845 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.007000 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-config\") pod \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.007409 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-dns-svc\") pod \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.007453 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j47zl\" (UniqueName: \"kubernetes.io/projected/28c0dd77-1fe3-4952-be0d-e9d0ef321273-kube-api-access-j47zl\") pod \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\" (UID: \"28c0dd77-1fe3-4952-be0d-e9d0ef321273\") " Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.016780 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c0dd77-1fe3-4952-be0d-e9d0ef321273-kube-api-access-j47zl" (OuterVolumeSpecName: "kube-api-access-j47zl") pod "28c0dd77-1fe3-4952-be0d-e9d0ef321273" (UID: "28c0dd77-1fe3-4952-be0d-e9d0ef321273"). InnerVolumeSpecName "kube-api-access-j47zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.075515 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-config" (OuterVolumeSpecName: "config") pod "28c0dd77-1fe3-4952-be0d-e9d0ef321273" (UID: "28c0dd77-1fe3-4952-be0d-e9d0ef321273"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.089222 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28c0dd77-1fe3-4952-be0d-e9d0ef321273" (UID: "28c0dd77-1fe3-4952-be0d-e9d0ef321273"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.109748 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j47zl\" (UniqueName: \"kubernetes.io/projected/28c0dd77-1fe3-4952-be0d-e9d0ef321273-kube-api-access-j47zl\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.109795 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.109808 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28c0dd77-1fe3-4952-be0d-e9d0ef321273-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.634198 4753 scope.go:117] "RemoveContainer" containerID="e085115e5cb3fe81227673565b520de9a5c875c69b73eef6583603ce1dbed345" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.655820 4753 scope.go:117] "RemoveContainer" containerID="2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.682627 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" event={"ID":"28c0dd77-1fe3-4952-be0d-e9d0ef321273","Type":"ContainerDied","Data":"d2851affe45639b5bbe45db9e741eee82c7c22738e5bcc2f61428d3bcb4c6b0a"} Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.682686 4753 scope.go:117] "RemoveContainer" containerID="2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.682698 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699964fbc-vj7cn" Jan 29 15:28:06 crc kubenswrapper[4753]: E0129 15:28:06.695814 4753 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_dnsmasq-dns_dnsmasq-dns-699964fbc-vj7cn_openstack_28c0dd77-1fe3-4952-be0d-e9d0ef321273_0 in pod sandbox d2851affe45639b5bbe45db9e741eee82c7c22738e5bcc2f61428d3bcb4c6b0a from index: no such id: '2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1'" containerID="2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.695852 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1"} err="rpc error: code = Unknown desc = failed to delete container k8s_dnsmasq-dns_dnsmasq-dns-699964fbc-vj7cn_openstack_28c0dd77-1fe3-4952-be0d-e9d0ef321273_0 in pod sandbox d2851affe45639b5bbe45db9e741eee82c7c22738e5bcc2f61428d3bcb4c6b0a from index: no such id: '2db1d3734e1f9a868fc4df6a2a68fd2f8e137ab50ab72af91cb568f113d708b1'" Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.718449 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-vj7cn"] Jan 29 15:28:06 crc kubenswrapper[4753]: I0129 15:28:06.737769 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699964fbc-vj7cn"] Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.911530 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 29 15:28:07 crc kubenswrapper[4753]: E0129 15:28:07.913694 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c0dd77-1fe3-4952-be0d-e9d0ef321273" containerName="init" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.913726 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c0dd77-1fe3-4952-be0d-e9d0ef321273" containerName="init" Jan 29 15:28:07 crc kubenswrapper[4753]: E0129 15:28:07.913766 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c0dd77-1fe3-4952-be0d-e9d0ef321273" containerName="dnsmasq-dns" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.913778 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c0dd77-1fe3-4952-be0d-e9d0ef321273" containerName="dnsmasq-dns" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.914080 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="28c0dd77-1fe3-4952-be0d-e9d0ef321273" containerName="dnsmasq-dns" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.915490 4753 util.go:30] "No sandbox for pod can be found. 
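
Two error shapes in this stretch are ordinarily benign races rather than real failures: cAdvisor's "Failed to process watch event ... Status 404" (the cgroup shows up before the runtime has registered the container) and the RemoveContainer "no such id" just above (the container had already left CRI-O's index when its sandbox d2851affe456... was torn down, so the explicit delete finds nothing). A triage sketch using those two signatures, taken verbatim from this log; the classification is a heuristic, not an authoritative rule:

    import re

    # Known-benign race signatures observed in this log.
    BENIGN = [
        re.compile(r"no such id: '[0-9a-f]{64}'"),
        re.compile(r"Status 404 returned error can't find the container"),
    ]

    def triage(line: str) -> str:
        return "benign-race" if any(p.search(line) for p in BENIGN) else "review"
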
Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.917971 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-cbt86" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.918272 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.923572 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.931034 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.946613 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-config\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.946673 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvvh2\" (UniqueName: \"kubernetes.io/projected/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-kube-api-access-pvvh2\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.946740 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.946764 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-scripts\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:07 crc kubenswrapper[4753]: I0129 15:28:07.946799 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.048168 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-config\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.048218 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvvh2\" (UniqueName: \"kubernetes.io/projected/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-kube-api-access-pvvh2\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.048277 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-ovn-rundir\") pod \"ovn-northd-0\" 
(UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.048299 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-scripts\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.048329 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.048758 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.049286 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-scripts\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.049957 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-config\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.053232 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.071137 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvvh2\" (UniqueName: \"kubernetes.io/projected/3ebf42b3-0d4c-45b4-b765-f02514c8dc3f-kube-api-access-pvvh2\") pod \"ovn-northd-0\" (UID: \"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f\") " pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.160567 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c0dd77-1fe3-4952-be0d-e9d0ef321273" path="/var/lib/kubelet/pods/28c0dd77-1fe3-4952-be0d-e9d0ef321273/volumes" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.236654 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.660548 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 29 15:28:08 crc kubenswrapper[4753]: I0129 15:28:08.704622 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f","Type":"ContainerStarted","Data":"af4a358ddc5900f1673c9b57ad2d1eddaa480657e07a6662da15476e29023345"} Jan 29 15:28:09 crc kubenswrapper[4753]: I0129 15:28:09.719380 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f","Type":"ContainerStarted","Data":"ba564f30df572056f78f6ba769ab04e11eb5e288aecb687baa5eb31cea6fb688"} Jan 29 15:28:09 crc kubenswrapper[4753]: I0129 15:28:09.720022 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 29 15:28:09 crc kubenswrapper[4753]: I0129 15:28:09.720063 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3ebf42b3-0d4c-45b4-b765-f02514c8dc3f","Type":"ContainerStarted","Data":"edaa8ecaaa39bea8ec37568fa22fe1e781e6e3615467f856c87140d7e489729d"} Jan 29 15:28:09 crc kubenswrapper[4753]: I0129 15:28:09.761078 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.761049946 podStartE2EDuration="2.761049946s" podCreationTimestamp="2026-01-29 15:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:28:09.741841109 +0000 UTC m=+5124.436575541" watchObservedRunningTime="2026-01-29 15:28:09.761049946 +0000 UTC m=+5124.455784368" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.149838 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:28:13 crc kubenswrapper[4753]: E0129 15:28:13.150570 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.150939 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wqvzr"] Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.152002 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.165789 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wqvzr"] Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.247407 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-36bb-account-create-update-sfm8j"] Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.248676 4753 util.go:30] "No sandbox for pod can be found. 
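
The machine-config-daemon entries above show restart back-off at its ceiling: the kubelet declines to start the container and reports "back-off 5m0s restarting failed container". The delay roughly doubles per failed restart from a 10s base up to the 5m cap; those constants mirror upstream kubelet defaults and are an assumption here, not values read from this log:

    # Back-off shape implied by "back-off 5m0s": double per failure, capped.
    BASE_S, CAP_S = 10, 300  # assumed upstream defaults

    def restart_delay(failures: int) -> int:
        return min(BASE_S * 2 ** failures, CAP_S)

    print([restart_delay(n) for n in range(7)])  # [10, 20, 40, 80, 160, 300, 300]
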
Need to start a new one" pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.250026 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.261716 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-36bb-account-create-update-sfm8j"] Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.340306 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxfjs\" (UniqueName: \"kubernetes.io/projected/479fd91e-fae0-432b-a820-d08f11fb8229-kube-api-access-sxfjs\") pod \"keystone-db-create-wqvzr\" (UID: \"479fd91e-fae0-432b-a820-d08f11fb8229\") " pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.340455 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479fd91e-fae0-432b-a820-d08f11fb8229-operator-scripts\") pod \"keystone-db-create-wqvzr\" (UID: \"479fd91e-fae0-432b-a820-d08f11fb8229\") " pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.442241 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08804285-e806-48a0-a925-67672aed97b5-operator-scripts\") pod \"keystone-36bb-account-create-update-sfm8j\" (UID: \"08804285-e806-48a0-a925-67672aed97b5\") " pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.442572 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxfjs\" (UniqueName: \"kubernetes.io/projected/479fd91e-fae0-432b-a820-d08f11fb8229-kube-api-access-sxfjs\") pod \"keystone-db-create-wqvzr\" (UID: \"479fd91e-fae0-432b-a820-d08f11fb8229\") " pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.442737 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z28xs\" (UniqueName: \"kubernetes.io/projected/08804285-e806-48a0-a925-67672aed97b5-kube-api-access-z28xs\") pod \"keystone-36bb-account-create-update-sfm8j\" (UID: \"08804285-e806-48a0-a925-67672aed97b5\") " pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.442924 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479fd91e-fae0-432b-a820-d08f11fb8229-operator-scripts\") pod \"keystone-db-create-wqvzr\" (UID: \"479fd91e-fae0-432b-a820-d08f11fb8229\") " pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.444488 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479fd91e-fae0-432b-a820-d08f11fb8229-operator-scripts\") pod \"keystone-db-create-wqvzr\" (UID: \"479fd91e-fae0-432b-a820-d08f11fb8229\") " pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.474767 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxfjs\" (UniqueName: \"kubernetes.io/projected/479fd91e-fae0-432b-a820-d08f11fb8229-kube-api-access-sxfjs\") pod 
\"keystone-db-create-wqvzr\" (UID: \"479fd91e-fae0-432b-a820-d08f11fb8229\") " pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.484598 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.546276 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08804285-e806-48a0-a925-67672aed97b5-operator-scripts\") pod \"keystone-36bb-account-create-update-sfm8j\" (UID: \"08804285-e806-48a0-a925-67672aed97b5\") " pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.546632 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z28xs\" (UniqueName: \"kubernetes.io/projected/08804285-e806-48a0-a925-67672aed97b5-kube-api-access-z28xs\") pod \"keystone-36bb-account-create-update-sfm8j\" (UID: \"08804285-e806-48a0-a925-67672aed97b5\") " pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.551101 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08804285-e806-48a0-a925-67672aed97b5-operator-scripts\") pod \"keystone-36bb-account-create-update-sfm8j\" (UID: \"08804285-e806-48a0-a925-67672aed97b5\") " pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.576445 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z28xs\" (UniqueName: \"kubernetes.io/projected/08804285-e806-48a0-a925-67672aed97b5-kube-api-access-z28xs\") pod \"keystone-36bb-account-create-update-sfm8j\" (UID: \"08804285-e806-48a0-a925-67672aed97b5\") " pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.865437 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:13 crc kubenswrapper[4753]: I0129 15:28:13.929754 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wqvzr"] Jan 29 15:28:13 crc kubenswrapper[4753]: W0129 15:28:13.949027 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod479fd91e_fae0_432b_a820_d08f11fb8229.slice/crio-d36f2831f220afe8878b71b5b5c56288947c73fcc57bdd8e70aa6570111f8363 WatchSource:0}: Error finding container d36f2831f220afe8878b71b5b5c56288947c73fcc57bdd8e70aa6570111f8363: Status 404 returned error can't find the container with id d36f2831f220afe8878b71b5b5c56288947c73fcc57bdd8e70aa6570111f8363 Jan 29 15:28:14 crc kubenswrapper[4753]: I0129 15:28:14.289906 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-36bb-account-create-update-sfm8j"] Jan 29 15:28:14 crc kubenswrapper[4753]: W0129 15:28:14.297864 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08804285_e806_48a0_a925_67672aed97b5.slice/crio-aa12f7f1f876bcf95dc5c21a8aac4d1697446325e904cca578305d9871f59e60 WatchSource:0}: Error finding container aa12f7f1f876bcf95dc5c21a8aac4d1697446325e904cca578305d9871f59e60: Status 404 returned error can't find the container with id aa12f7f1f876bcf95dc5c21a8aac4d1697446325e904cca578305d9871f59e60 Jan 29 15:28:14 crc kubenswrapper[4753]: I0129 15:28:14.764026 4753 generic.go:334] "Generic (PLEG): container finished" podID="08804285-e806-48a0-a925-67672aed97b5" containerID="5eb433e494a46864bdc3f079b1b838b7681c3b991a7ceb1f29fd14d8678b18c6" exitCode=0 Jan 29 15:28:14 crc kubenswrapper[4753]: I0129 15:28:14.764127 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-36bb-account-create-update-sfm8j" event={"ID":"08804285-e806-48a0-a925-67672aed97b5","Type":"ContainerDied","Data":"5eb433e494a46864bdc3f079b1b838b7681c3b991a7ceb1f29fd14d8678b18c6"} Jan 29 15:28:14 crc kubenswrapper[4753]: I0129 15:28:14.764193 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-36bb-account-create-update-sfm8j" event={"ID":"08804285-e806-48a0-a925-67672aed97b5","Type":"ContainerStarted","Data":"aa12f7f1f876bcf95dc5c21a8aac4d1697446325e904cca578305d9871f59e60"} Jan 29 15:28:14 crc kubenswrapper[4753]: I0129 15:28:14.765941 4753 generic.go:334] "Generic (PLEG): container finished" podID="479fd91e-fae0-432b-a820-d08f11fb8229" containerID="cc2a6fc09f44ac5217c4ffde4c2928b7956d1a5ed5ca9d830a11b1554079ca1f" exitCode=0 Jan 29 15:28:14 crc kubenswrapper[4753]: I0129 15:28:14.765993 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wqvzr" event={"ID":"479fd91e-fae0-432b-a820-d08f11fb8229","Type":"ContainerDied","Data":"cc2a6fc09f44ac5217c4ffde4c2928b7956d1a5ed5ca9d830a11b1554079ca1f"} Jan 29 15:28:14 crc kubenswrapper[4753]: I0129 15:28:14.766020 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wqvzr" event={"ID":"479fd91e-fae0-432b-a820-d08f11fb8229","Type":"ContainerStarted","Data":"d36f2831f220afe8878b71b5b5c56288947c73fcc57bdd8e70aa6570111f8363"} Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.247145 4753 util.go:48] "No ready sandbox for pod can be found. 
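
The PLEG entries above carry enough structure to rebuild each pod's container timeline: generic.go reports exit codes, while kubelet.go emits ContainerStarted/ContainerDied with the container (or sandbox) ID. Note that a single relist can deliver events out of wall-clock order; for both keystone jobs the container's death (exit 0) is reported together with the sandbox's start. A grouping sketch, with the regex matching the exact event shape above:

    import re
    from collections import defaultdict

    # Group PLEG lifecycle events by pod name.
    PLEG = re.compile(
        r'event for pod" pod="(?P<pod>[^"]+)" event=\{"ID":"[^"]+",'
        r'"Type":"(?P<type>\w+)","Data":"(?P<data>[0-9a-f]+)"\}'
    )

    def pleg_timeline(lines):
        out = defaultdict(list)
        for ln in lines:
            for m in PLEG.finditer(ln):
                out[m["pod"]].append((m["type"], m["data"][:12]))
        return dict(out)
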
Need to start a new one" pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.255598 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.400132 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479fd91e-fae0-432b-a820-d08f11fb8229-operator-scripts\") pod \"479fd91e-fae0-432b-a820-d08f11fb8229\" (UID: \"479fd91e-fae0-432b-a820-d08f11fb8229\") " Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.400215 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z28xs\" (UniqueName: \"kubernetes.io/projected/08804285-e806-48a0-a925-67672aed97b5-kube-api-access-z28xs\") pod \"08804285-e806-48a0-a925-67672aed97b5\" (UID: \"08804285-e806-48a0-a925-67672aed97b5\") " Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.400262 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxfjs\" (UniqueName: \"kubernetes.io/projected/479fd91e-fae0-432b-a820-d08f11fb8229-kube-api-access-sxfjs\") pod \"479fd91e-fae0-432b-a820-d08f11fb8229\" (UID: \"479fd91e-fae0-432b-a820-d08f11fb8229\") " Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.400294 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08804285-e806-48a0-a925-67672aed97b5-operator-scripts\") pod \"08804285-e806-48a0-a925-67672aed97b5\" (UID: \"08804285-e806-48a0-a925-67672aed97b5\") " Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.401802 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479fd91e-fae0-432b-a820-d08f11fb8229-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "479fd91e-fae0-432b-a820-d08f11fb8229" (UID: "479fd91e-fae0-432b-a820-d08f11fb8229"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.402019 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08804285-e806-48a0-a925-67672aed97b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08804285-e806-48a0-a925-67672aed97b5" (UID: "08804285-e806-48a0-a925-67672aed97b5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.407529 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479fd91e-fae0-432b-a820-d08f11fb8229-kube-api-access-sxfjs" (OuterVolumeSpecName: "kube-api-access-sxfjs") pod "479fd91e-fae0-432b-a820-d08f11fb8229" (UID: "479fd91e-fae0-432b-a820-d08f11fb8229"). InnerVolumeSpecName "kube-api-access-sxfjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.418855 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08804285-e806-48a0-a925-67672aed97b5-kube-api-access-z28xs" (OuterVolumeSpecName: "kube-api-access-z28xs") pod "08804285-e806-48a0-a925-67672aed97b5" (UID: "08804285-e806-48a0-a925-67672aed97b5"). InnerVolumeSpecName "kube-api-access-z28xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.502336 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479fd91e-fae0-432b-a820-d08f11fb8229-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.502743 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z28xs\" (UniqueName: \"kubernetes.io/projected/08804285-e806-48a0-a925-67672aed97b5-kube-api-access-z28xs\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.502898 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxfjs\" (UniqueName: \"kubernetes.io/projected/479fd91e-fae0-432b-a820-d08f11fb8229-kube-api-access-sxfjs\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.503008 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08804285-e806-48a0-a925-67672aed97b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.789864 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wqvzr" event={"ID":"479fd91e-fae0-432b-a820-d08f11fb8229","Type":"ContainerDied","Data":"d36f2831f220afe8878b71b5b5c56288947c73fcc57bdd8e70aa6570111f8363"} Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.789939 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d36f2831f220afe8878b71b5b5c56288947c73fcc57bdd8e70aa6570111f8363" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.789948 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wqvzr" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.792625 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-36bb-account-create-update-sfm8j" event={"ID":"08804285-e806-48a0-a925-67672aed97b5","Type":"ContainerDied","Data":"aa12f7f1f876bcf95dc5c21a8aac4d1697446325e904cca578305d9871f59e60"} Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.792680 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa12f7f1f876bcf95dc5c21a8aac4d1697446325e904cca578305d9871f59e60" Jan 29 15:28:16 crc kubenswrapper[4753]: I0129 15:28:16.793013 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-36bb-account-create-update-sfm8j" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.296925 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.613259 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-thdvt"] Jan 29 15:28:18 crc kubenswrapper[4753]: E0129 15:28:18.613584 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08804285-e806-48a0-a925-67672aed97b5" containerName="mariadb-account-create-update" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.613601 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="08804285-e806-48a0-a925-67672aed97b5" containerName="mariadb-account-create-update" Jan 29 15:28:18 crc kubenswrapper[4753]: E0129 15:28:18.613614 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479fd91e-fae0-432b-a820-d08f11fb8229" containerName="mariadb-database-create" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.613621 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="479fd91e-fae0-432b-a820-d08f11fb8229" containerName="mariadb-database-create" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.613747 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="08804285-e806-48a0-a925-67672aed97b5" containerName="mariadb-account-create-update" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.613762 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="479fd91e-fae0-432b-a820-d08f11fb8229" containerName="mariadb-database-create" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.614265 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.616343 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nqkj" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.620850 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.621047 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.621444 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.629658 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-thdvt"] Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.639552 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-config-data\") pod \"keystone-db-sync-thdvt\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.639629 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b94h2\" (UniqueName: \"kubernetes.io/projected/d843255e-b0da-492a-92e6-6d42d8ef9848-kube-api-access-b94h2\") pod \"keystone-db-sync-thdvt\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.639762 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-combined-ca-bundle\") pod \"keystone-db-sync-thdvt\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.741059 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b94h2\" (UniqueName: \"kubernetes.io/projected/d843255e-b0da-492a-92e6-6d42d8ef9848-kube-api-access-b94h2\") pod \"keystone-db-sync-thdvt\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.741224 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-combined-ca-bundle\") pod \"keystone-db-sync-thdvt\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.741252 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-config-data\") pod \"keystone-db-sync-thdvt\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.748953 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-combined-ca-bundle\") pod \"keystone-db-sync-thdvt\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " 
pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.751632 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-config-data\") pod \"keystone-db-sync-thdvt\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.757524 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b94h2\" (UniqueName: \"kubernetes.io/projected/d843255e-b0da-492a-92e6-6d42d8ef9848-kube-api-access-b94h2\") pod \"keystone-db-sync-thdvt\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:18 crc kubenswrapper[4753]: I0129 15:28:18.931747 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:19 crc kubenswrapper[4753]: W0129 15:28:19.394486 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd843255e_b0da_492a_92e6_6d42d8ef9848.slice/crio-f7c80a18b63ea704a7262c2357bd959e517bcb2e7cf0f9300999b63466f16aa5 WatchSource:0}: Error finding container f7c80a18b63ea704a7262c2357bd959e517bcb2e7cf0f9300999b63466f16aa5: Status 404 returned error can't find the container with id f7c80a18b63ea704a7262c2357bd959e517bcb2e7cf0f9300999b63466f16aa5 Jan 29 15:28:19 crc kubenswrapper[4753]: I0129 15:28:19.394851 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-thdvt"] Jan 29 15:28:19 crc kubenswrapper[4753]: I0129 15:28:19.818385 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-thdvt" event={"ID":"d843255e-b0da-492a-92e6-6d42d8ef9848","Type":"ContainerStarted","Data":"8b88f155823524b0f724f10d95de9fe9d8446ee03582534c88dc48df98219555"} Jan 29 15:28:19 crc kubenswrapper[4753]: I0129 15:28:19.818446 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-thdvt" event={"ID":"d843255e-b0da-492a-92e6-6d42d8ef9848","Type":"ContainerStarted","Data":"f7c80a18b63ea704a7262c2357bd959e517bcb2e7cf0f9300999b63466f16aa5"} Jan 29 15:28:19 crc kubenswrapper[4753]: I0129 15:28:19.843800 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-thdvt" podStartSLOduration=1.843715792 podStartE2EDuration="1.843715792s" podCreationTimestamp="2026-01-29 15:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:28:19.841243866 +0000 UTC m=+5134.535978258" watchObservedRunningTime="2026-01-29 15:28:19.843715792 +0000 UTC m=+5134.538450184" Jan 29 15:28:21 crc kubenswrapper[4753]: I0129 15:28:21.841093 4753 generic.go:334] "Generic (PLEG): container finished" podID="d843255e-b0da-492a-92e6-6d42d8ef9848" containerID="8b88f155823524b0f724f10d95de9fe9d8446ee03582534c88dc48df98219555" exitCode=0 Jan 29 15:28:21 crc kubenswrapper[4753]: I0129 15:28:21.841212 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-thdvt" event={"ID":"d843255e-b0da-492a-92e6-6d42d8ef9848","Type":"ContainerDied","Data":"8b88f155823524b0f724f10d95de9fe9d8446ee03582534c88dc48df98219555"} Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.306267 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.434544 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-combined-ca-bundle\") pod \"d843255e-b0da-492a-92e6-6d42d8ef9848\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.434619 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-config-data\") pod \"d843255e-b0da-492a-92e6-6d42d8ef9848\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.434664 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b94h2\" (UniqueName: \"kubernetes.io/projected/d843255e-b0da-492a-92e6-6d42d8ef9848-kube-api-access-b94h2\") pod \"d843255e-b0da-492a-92e6-6d42d8ef9848\" (UID: \"d843255e-b0da-492a-92e6-6d42d8ef9848\") " Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.449425 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d843255e-b0da-492a-92e6-6d42d8ef9848-kube-api-access-b94h2" (OuterVolumeSpecName: "kube-api-access-b94h2") pod "d843255e-b0da-492a-92e6-6d42d8ef9848" (UID: "d843255e-b0da-492a-92e6-6d42d8ef9848"). InnerVolumeSpecName "kube-api-access-b94h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.481365 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d843255e-b0da-492a-92e6-6d42d8ef9848" (UID: "d843255e-b0da-492a-92e6-6d42d8ef9848"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.501058 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-config-data" (OuterVolumeSpecName: "config-data") pod "d843255e-b0da-492a-92e6-6d42d8ef9848" (UID: "d843255e-b0da-492a-92e6-6d42d8ef9848"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.536196 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.536237 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d843255e-b0da-492a-92e6-6d42d8ef9848-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.536250 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b94h2\" (UniqueName: \"kubernetes.io/projected/d843255e-b0da-492a-92e6-6d42d8ef9848-kube-api-access-b94h2\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.871621 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-thdvt" event={"ID":"d843255e-b0da-492a-92e6-6d42d8ef9848","Type":"ContainerDied","Data":"f7c80a18b63ea704a7262c2357bd959e517bcb2e7cf0f9300999b63466f16aa5"} Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.872391 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7c80a18b63ea704a7262c2357bd959e517bcb2e7cf0f9300999b63466f16aa5" Jan 29 15:28:23 crc kubenswrapper[4753]: I0129 15:28:23.871730 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-thdvt" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.163437 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64dc58b59-5pmzk"] Jan 29 15:28:24 crc kubenswrapper[4753]: E0129 15:28:24.163739 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d843255e-b0da-492a-92e6-6d42d8ef9848" containerName="keystone-db-sync" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.163755 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d843255e-b0da-492a-92e6-6d42d8ef9848" containerName="keystone-db-sync" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.163940 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d843255e-b0da-492a-92e6-6d42d8ef9848" containerName="keystone-db-sync" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.165106 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.184935 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64dc58b59-5pmzk"] Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.192529 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-c8nlw"] Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.193498 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.198052 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.198381 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.198659 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.199209 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nqkj" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.199247 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.222588 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c8nlw"] Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.252648 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-scripts\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.252706 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-dns-svc\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.252742 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-fernet-keys\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.252764 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skv8l\" (UniqueName: \"kubernetes.io/projected/a5ed2a79-bd49-4241-a902-633a212cb20a-kube-api-access-skv8l\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.252820 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-config\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.252856 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-nb\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.252942 4753 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-sb\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.252964 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-config-data\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.354938 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-scripts\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355003 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-dns-svc\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355039 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-fernet-keys\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355069 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skv8l\" (UniqueName: \"kubernetes.io/projected/a5ed2a79-bd49-4241-a902-633a212cb20a-kube-api-access-skv8l\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355134 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-config\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355187 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-nb\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355218 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-combined-ca-bundle\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355264 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-credential-keys\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355331 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c4jw\" (UniqueName: \"kubernetes.io/projected/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-kube-api-access-7c4jw\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355362 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-sb\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.355385 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-config-data\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.356330 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-nb\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.357072 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-config\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.357654 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-sb\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.358223 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-dns-svc\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.358895 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-scripts\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.359344 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-fernet-keys\") pod \"keystone-bootstrap-c8nlw\" (UID: 
\"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.370274 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-config-data\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.379437 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skv8l\" (UniqueName: \"kubernetes.io/projected/a5ed2a79-bd49-4241-a902-633a212cb20a-kube-api-access-skv8l\") pod \"dnsmasq-dns-64dc58b59-5pmzk\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.456447 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-combined-ca-bundle\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.456746 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-credential-keys\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.456795 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c4jw\" (UniqueName: \"kubernetes.io/projected/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-kube-api-access-7c4jw\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.460806 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-combined-ca-bundle\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.461224 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-credential-keys\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.495718 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.496496 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c4jw\" (UniqueName: \"kubernetes.io/projected/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-kube-api-access-7c4jw\") pod \"keystone-bootstrap-c8nlw\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:24 crc kubenswrapper[4753]: I0129 15:28:24.516432 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:25 crc kubenswrapper[4753]: W0129 15:28:25.040176 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ab39725_f918_404d_9cc5_3f76e4ddcbaa.slice/crio-bad1892aac5d36736f4d8125d19bae1167773bf437955380b59748b55ae97dd6 WatchSource:0}: Error finding container bad1892aac5d36736f4d8125d19bae1167773bf437955380b59748b55ae97dd6: Status 404 returned error can't find the container with id bad1892aac5d36736f4d8125d19bae1167773bf437955380b59748b55ae97dd6 Jan 29 15:28:25 crc kubenswrapper[4753]: I0129 15:28:25.042050 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c8nlw"] Jan 29 15:28:25 crc kubenswrapper[4753]: W0129 15:28:25.095717 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ed2a79_bd49_4241_a902_633a212cb20a.slice/crio-e137a76c5d21ed02d00c79e6b367c3f432fc0186d1294ff8421857a925d07aa2 WatchSource:0}: Error finding container e137a76c5d21ed02d00c79e6b367c3f432fc0186d1294ff8421857a925d07aa2: Status 404 returned error can't find the container with id e137a76c5d21ed02d00c79e6b367c3f432fc0186d1294ff8421857a925d07aa2 Jan 29 15:28:25 crc kubenswrapper[4753]: I0129 15:28:25.098146 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64dc58b59-5pmzk"] Jan 29 15:28:25 crc kubenswrapper[4753]: I0129 15:28:25.895782 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5ed2a79-bd49-4241-a902-633a212cb20a" containerID="7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922" exitCode=0 Jan 29 15:28:25 crc kubenswrapper[4753]: I0129 15:28:25.895838 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" event={"ID":"a5ed2a79-bd49-4241-a902-633a212cb20a","Type":"ContainerDied","Data":"7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922"} Jan 29 15:28:25 crc kubenswrapper[4753]: I0129 15:28:25.896255 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" event={"ID":"a5ed2a79-bd49-4241-a902-633a212cb20a","Type":"ContainerStarted","Data":"e137a76c5d21ed02d00c79e6b367c3f432fc0186d1294ff8421857a925d07aa2"} Jan 29 15:28:25 crc kubenswrapper[4753]: I0129 15:28:25.899214 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c8nlw" event={"ID":"7ab39725-f918-404d-9cc5-3f76e4ddcbaa","Type":"ContainerStarted","Data":"35f784f78a74ebf146871cc1f34d0e1771c60d7180ab610c68c41a92b60eac57"} Jan 29 15:28:25 crc kubenswrapper[4753]: I0129 15:28:25.899292 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c8nlw" event={"ID":"7ab39725-f918-404d-9cc5-3f76e4ddcbaa","Type":"ContainerStarted","Data":"bad1892aac5d36736f4d8125d19bae1167773bf437955380b59748b55ae97dd6"} Jan 29 15:28:25 crc kubenswrapper[4753]: I0129 15:28:25.955291 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-c8nlw" podStartSLOduration=1.955270086 podStartE2EDuration="1.955270086s" podCreationTimestamp="2026-01-29 15:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:28:25.951060473 +0000 UTC m=+5140.645794895" watchObservedRunningTime="2026-01-29 15:28:25.955270086 +0000 UTC m=+5140.650004478" Jan 29 
15:28:26 crc kubenswrapper[4753]: I0129 15:28:26.173604 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:28:26 crc kubenswrapper[4753]: E0129 15:28:26.175763 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:28:26 crc kubenswrapper[4753]: I0129 15:28:26.909648 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" event={"ID":"a5ed2a79-bd49-4241-a902-633a212cb20a","Type":"ContainerStarted","Data":"a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144"} Jan 29 15:28:26 crc kubenswrapper[4753]: I0129 15:28:26.938064 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" podStartSLOduration=2.93804672 podStartE2EDuration="2.93804672s" podCreationTimestamp="2026-01-29 15:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:28:26.936998242 +0000 UTC m=+5141.631732664" watchObservedRunningTime="2026-01-29 15:28:26.93804672 +0000 UTC m=+5141.632781102" Jan 29 15:28:27 crc kubenswrapper[4753]: I0129 15:28:27.916840 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:28 crc kubenswrapper[4753]: I0129 15:28:28.929859 4753 generic.go:334] "Generic (PLEG): container finished" podID="7ab39725-f918-404d-9cc5-3f76e4ddcbaa" containerID="35f784f78a74ebf146871cc1f34d0e1771c60d7180ab610c68c41a92b60eac57" exitCode=0 Jan 29 15:28:28 crc kubenswrapper[4753]: I0129 15:28:28.930117 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c8nlw" event={"ID":"7ab39725-f918-404d-9cc5-3f76e4ddcbaa","Type":"ContainerDied","Data":"35f784f78a74ebf146871cc1f34d0e1771c60d7180ab610c68c41a92b60eac57"} Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.284746 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.383405 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-config-data\") pod \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.383515 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-fernet-keys\") pod \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.383567 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-combined-ca-bundle\") pod \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.383644 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-credential-keys\") pod \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.383695 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4jw\" (UniqueName: \"kubernetes.io/projected/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-kube-api-access-7c4jw\") pod \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.383756 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-scripts\") pod \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\" (UID: \"7ab39725-f918-404d-9cc5-3f76e4ddcbaa\") " Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.390342 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7ab39725-f918-404d-9cc5-3f76e4ddcbaa" (UID: "7ab39725-f918-404d-9cc5-3f76e4ddcbaa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.390440 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-kube-api-access-7c4jw" (OuterVolumeSpecName: "kube-api-access-7c4jw") pod "7ab39725-f918-404d-9cc5-3f76e4ddcbaa" (UID: "7ab39725-f918-404d-9cc5-3f76e4ddcbaa"). InnerVolumeSpecName "kube-api-access-7c4jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.391290 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7ab39725-f918-404d-9cc5-3f76e4ddcbaa" (UID: "7ab39725-f918-404d-9cc5-3f76e4ddcbaa"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.397672 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-scripts" (OuterVolumeSpecName: "scripts") pod "7ab39725-f918-404d-9cc5-3f76e4ddcbaa" (UID: "7ab39725-f918-404d-9cc5-3f76e4ddcbaa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.412345 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ab39725-f918-404d-9cc5-3f76e4ddcbaa" (UID: "7ab39725-f918-404d-9cc5-3f76e4ddcbaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.419870 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-config-data" (OuterVolumeSpecName: "config-data") pod "7ab39725-f918-404d-9cc5-3f76e4ddcbaa" (UID: "7ab39725-f918-404d-9cc5-3f76e4ddcbaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.485690 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.485721 4753 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.485757 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.485769 4753 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.485779 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4jw\" (UniqueName: \"kubernetes.io/projected/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-kube-api-access-7c4jw\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.485790 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab39725-f918-404d-9cc5-3f76e4ddcbaa-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.958199 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c8nlw" event={"ID":"7ab39725-f918-404d-9cc5-3f76e4ddcbaa","Type":"ContainerDied","Data":"bad1892aac5d36736f4d8125d19bae1167773bf437955380b59748b55ae97dd6"} Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.958257 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bad1892aac5d36736f4d8125d19bae1167773bf437955380b59748b55ae97dd6" Jan 29 15:28:30 crc kubenswrapper[4753]: I0129 15:28:30.958308 4753 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c8nlw" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.041070 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-c8nlw"] Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.047222 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-c8nlw"] Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.156885 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5482c"] Jan 29 15:28:31 crc kubenswrapper[4753]: E0129 15:28:31.157458 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab39725-f918-404d-9cc5-3f76e4ddcbaa" containerName="keystone-bootstrap" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.157491 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab39725-f918-404d-9cc5-3f76e4ddcbaa" containerName="keystone-bootstrap" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.157750 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab39725-f918-404d-9cc5-3f76e4ddcbaa" containerName="keystone-bootstrap" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.158676 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.161977 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.163369 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nqkj" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.163562 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.163590 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.163901 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.171124 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5482c"] Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.197915 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-scripts\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.198400 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-fernet-keys\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.198463 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-config-data\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 
15:28:31.198524 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf4fb\" (UniqueName: \"kubernetes.io/projected/1b556f5d-9b16-408d-88fb-6f9490dde420-kube-api-access-mf4fb\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.198693 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-combined-ca-bundle\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.198895 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-credential-keys\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.301083 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-fernet-keys\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.301196 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-config-data\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.301263 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf4fb\" (UniqueName: \"kubernetes.io/projected/1b556f5d-9b16-408d-88fb-6f9490dde420-kube-api-access-mf4fb\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.301335 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-combined-ca-bundle\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.301436 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-credential-keys\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.301498 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-scripts\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.306060 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-combined-ca-bundle\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.306724 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-fernet-keys\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.306777 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-scripts\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.307461 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-config-data\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.308189 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-credential-keys\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.319652 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf4fb\" (UniqueName: \"kubernetes.io/projected/1b556f5d-9b16-408d-88fb-6f9490dde420-kube-api-access-mf4fb\") pod \"keystone-bootstrap-5482c\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.482704 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:31 crc kubenswrapper[4753]: I0129 15:28:31.981916 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5482c"] Jan 29 15:28:32 crc kubenswrapper[4753]: I0129 15:28:32.164394 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab39725-f918-404d-9cc5-3f76e4ddcbaa" path="/var/lib/kubelet/pods/7ab39725-f918-404d-9cc5-3f76e4ddcbaa/volumes" Jan 29 15:28:32 crc kubenswrapper[4753]: I0129 15:28:32.985636 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5482c" event={"ID":"1b556f5d-9b16-408d-88fb-6f9490dde420","Type":"ContainerStarted","Data":"60ad808dfa00971620b658170303e06e7ae34ec07319de287c382d6ff5a3d568"} Jan 29 15:28:32 crc kubenswrapper[4753]: I0129 15:28:32.985702 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5482c" event={"ID":"1b556f5d-9b16-408d-88fb-6f9490dde420","Type":"ContainerStarted","Data":"680ba4b6a344b38caa1b6d48770e4d6797412d2784ba866865b95e6bc79fcb1b"} Jan 29 15:28:33 crc kubenswrapper[4753]: I0129 15:28:33.010063 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5482c" podStartSLOduration=2.010032211 podStartE2EDuration="2.010032211s" podCreationTimestamp="2026-01-29 15:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:28:33.006214148 +0000 UTC m=+5147.700948540" watchObservedRunningTime="2026-01-29 15:28:33.010032211 +0000 UTC m=+5147.704766643" Jan 29 15:28:34 crc kubenswrapper[4753]: I0129 15:28:34.497957 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:28:34 crc kubenswrapper[4753]: I0129 15:28:34.583220 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fbb6c5-k6k4z"] Jan 29 15:28:34 crc kubenswrapper[4753]: I0129 15:28:34.583540 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" podUID="05ee8739-6c98-455d-a5e8-f9b7b55788ed" containerName="dnsmasq-dns" containerID="cri-o://ff10efe4e2d91bbf36c5c780589d145331c9d9eeaa71ad4903c58cdd437c498b" gracePeriod=10 Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.003746 4753 generic.go:334] "Generic (PLEG): container finished" podID="1b556f5d-9b16-408d-88fb-6f9490dde420" containerID="60ad808dfa00971620b658170303e06e7ae34ec07319de287c382d6ff5a3d568" exitCode=0 Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.003829 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5482c" event={"ID":"1b556f5d-9b16-408d-88fb-6f9490dde420","Type":"ContainerDied","Data":"60ad808dfa00971620b658170303e06e7ae34ec07319de287c382d6ff5a3d568"} Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.007205 4753 generic.go:334] "Generic (PLEG): container finished" podID="05ee8739-6c98-455d-a5e8-f9b7b55788ed" containerID="ff10efe4e2d91bbf36c5c780589d145331c9d9eeaa71ad4903c58cdd437c498b" exitCode=0 Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.007274 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" event={"ID":"05ee8739-6c98-455d-a5e8-f9b7b55788ed","Type":"ContainerDied","Data":"ff10efe4e2d91bbf36c5c780589d145331c9d9eeaa71ad4903c58cdd437c498b"} Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 
15:28:35.007316 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" event={"ID":"05ee8739-6c98-455d-a5e8-f9b7b55788ed","Type":"ContainerDied","Data":"5dba340e6c256bf385daf5cacc7af19f6974f757f29fc75c85eee5e646ebe2cf"} Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.007335 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dba340e6c256bf385daf5cacc7af19f6974f757f29fc75c85eee5e646ebe2cf" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.084944 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.179251 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-dns-svc\") pod \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.179335 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-sb\") pod \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.179426 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-nb\") pod \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.179459 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-config\") pod \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.179481 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9s2l\" (UniqueName: \"kubernetes.io/projected/05ee8739-6c98-455d-a5e8-f9b7b55788ed-kube-api-access-n9s2l\") pod \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\" (UID: \"05ee8739-6c98-455d-a5e8-f9b7b55788ed\") " Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.187028 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ee8739-6c98-455d-a5e8-f9b7b55788ed-kube-api-access-n9s2l" (OuterVolumeSpecName: "kube-api-access-n9s2l") pod "05ee8739-6c98-455d-a5e8-f9b7b55788ed" (UID: "05ee8739-6c98-455d-a5e8-f9b7b55788ed"). InnerVolumeSpecName "kube-api-access-n9s2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.223542 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "05ee8739-6c98-455d-a5e8-f9b7b55788ed" (UID: "05ee8739-6c98-455d-a5e8-f9b7b55788ed"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.227909 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "05ee8739-6c98-455d-a5e8-f9b7b55788ed" (UID: "05ee8739-6c98-455d-a5e8-f9b7b55788ed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.230172 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "05ee8739-6c98-455d-a5e8-f9b7b55788ed" (UID: "05ee8739-6c98-455d-a5e8-f9b7b55788ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.234698 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-config" (OuterVolumeSpecName: "config") pod "05ee8739-6c98-455d-a5e8-f9b7b55788ed" (UID: "05ee8739-6c98-455d-a5e8-f9b7b55788ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.280790 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.280819 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.280829 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.280838 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05ee8739-6c98-455d-a5e8-f9b7b55788ed-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:35 crc kubenswrapper[4753]: I0129 15:28:35.280847 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9s2l\" (UniqueName: \"kubernetes.io/projected/05ee8739-6c98-455d-a5e8-f9b7b55788ed-kube-api-access-n9s2l\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.014981 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fbb6c5-k6k4z" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.065320 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fbb6c5-k6k4z"] Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.073058 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c4fbb6c5-k6k4z"] Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.164326 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ee8739-6c98-455d-a5e8-f9b7b55788ed" path="/var/lib/kubelet/pods/05ee8739-6c98-455d-a5e8-f9b7b55788ed/volumes" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.388699 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.500396 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-scripts\") pod \"1b556f5d-9b16-408d-88fb-6f9490dde420\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.500580 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-credential-keys\") pod \"1b556f5d-9b16-408d-88fb-6f9490dde420\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.500614 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-combined-ca-bundle\") pod \"1b556f5d-9b16-408d-88fb-6f9490dde420\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.500737 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-fernet-keys\") pod \"1b556f5d-9b16-408d-88fb-6f9490dde420\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.500792 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf4fb\" (UniqueName: \"kubernetes.io/projected/1b556f5d-9b16-408d-88fb-6f9490dde420-kube-api-access-mf4fb\") pod \"1b556f5d-9b16-408d-88fb-6f9490dde420\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.500816 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-config-data\") pod \"1b556f5d-9b16-408d-88fb-6f9490dde420\" (UID: \"1b556f5d-9b16-408d-88fb-6f9490dde420\") " Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.506495 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1b556f5d-9b16-408d-88fb-6f9490dde420" (UID: "1b556f5d-9b16-408d-88fb-6f9490dde420"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.506656 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b556f5d-9b16-408d-88fb-6f9490dde420-kube-api-access-mf4fb" (OuterVolumeSpecName: "kube-api-access-mf4fb") pod "1b556f5d-9b16-408d-88fb-6f9490dde420" (UID: "1b556f5d-9b16-408d-88fb-6f9490dde420"). InnerVolumeSpecName "kube-api-access-mf4fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.507394 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1b556f5d-9b16-408d-88fb-6f9490dde420" (UID: "1b556f5d-9b16-408d-88fb-6f9490dde420"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.508422 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-scripts" (OuterVolumeSpecName: "scripts") pod "1b556f5d-9b16-408d-88fb-6f9490dde420" (UID: "1b556f5d-9b16-408d-88fb-6f9490dde420"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.525814 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-config-data" (OuterVolumeSpecName: "config-data") pod "1b556f5d-9b16-408d-88fb-6f9490dde420" (UID: "1b556f5d-9b16-408d-88fb-6f9490dde420"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.533284 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b556f5d-9b16-408d-88fb-6f9490dde420" (UID: "1b556f5d-9b16-408d-88fb-6f9490dde420"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.602709 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.602762 4753 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.602779 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.602796 4753 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.602809 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf4fb\" (UniqueName: \"kubernetes.io/projected/1b556f5d-9b16-408d-88fb-6f9490dde420-kube-api-access-mf4fb\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:36 crc kubenswrapper[4753]: I0129 15:28:36.602820 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b556f5d-9b16-408d-88fb-6f9490dde420-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.029278 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5482c" event={"ID":"1b556f5d-9b16-408d-88fb-6f9490dde420","Type":"ContainerDied","Data":"680ba4b6a344b38caa1b6d48770e4d6797412d2784ba866865b95e6bc79fcb1b"} Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.029345 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="680ba4b6a344b38caa1b6d48770e4d6797412d2784ba866865b95e6bc79fcb1b" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.030245 4753 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5482c" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.132881 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d955797-s6jpd"] Jan 29 15:28:37 crc kubenswrapper[4753]: E0129 15:28:37.133392 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b556f5d-9b16-408d-88fb-6f9490dde420" containerName="keystone-bootstrap" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.133426 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b556f5d-9b16-408d-88fb-6f9490dde420" containerName="keystone-bootstrap" Jan 29 15:28:37 crc kubenswrapper[4753]: E0129 15:28:37.133464 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ee8739-6c98-455d-a5e8-f9b7b55788ed" containerName="init" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.133477 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ee8739-6c98-455d-a5e8-f9b7b55788ed" containerName="init" Jan 29 15:28:37 crc kubenswrapper[4753]: E0129 15:28:37.133499 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ee8739-6c98-455d-a5e8-f9b7b55788ed" containerName="dnsmasq-dns" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.133514 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ee8739-6c98-455d-a5e8-f9b7b55788ed" containerName="dnsmasq-dns" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.133769 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ee8739-6c98-455d-a5e8-f9b7b55788ed" containerName="dnsmasq-dns" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.133803 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b556f5d-9b16-408d-88fb-6f9490dde420" containerName="keystone-bootstrap" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.134686 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.136902 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.141602 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4nqkj" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.141686 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.141757 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.159118 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d955797-s6jpd"] Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.213046 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-combined-ca-bundle\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.213612 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-config-data\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.213763 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-credential-keys\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.213998 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-fernet-keys\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.214144 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-scripts\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.214371 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26c9\" (UniqueName: \"kubernetes.io/projected/f5d2e134-c722-47ef-b1c9-696e16fa72ce-kube-api-access-r26c9\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.325444 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-combined-ca-bundle\") pod \"keystone-6d955797-s6jpd\" (UID: 
\"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.326011 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-config-data\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.326129 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-credential-keys\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.326422 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-fernet-keys\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.326566 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-scripts\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.326801 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r26c9\" (UniqueName: \"kubernetes.io/projected/f5d2e134-c722-47ef-b1c9-696e16fa72ce-kube-api-access-r26c9\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.330289 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-config-data\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.330653 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-fernet-keys\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.330712 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-combined-ca-bundle\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.331638 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-credential-keys\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.332636 4753 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5d2e134-c722-47ef-b1c9-696e16fa72ce-scripts\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.352594 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26c9\" (UniqueName: \"kubernetes.io/projected/f5d2e134-c722-47ef-b1c9-696e16fa72ce-kube-api-access-r26c9\") pod \"keystone-6d955797-s6jpd\" (UID: \"f5d2e134-c722-47ef-b1c9-696e16fa72ce\") " pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.461830 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:37 crc kubenswrapper[4753]: I0129 15:28:37.933933 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d955797-s6jpd"] Jan 29 15:28:38 crc kubenswrapper[4753]: I0129 15:28:38.041834 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d955797-s6jpd" event={"ID":"f5d2e134-c722-47ef-b1c9-696e16fa72ce","Type":"ContainerStarted","Data":"89278a40021e46cd38ad9a97a2da7be0deb1d00321e000a0384c95c0feecb9eb"} Jan 29 15:28:39 crc kubenswrapper[4753]: I0129 15:28:39.051191 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d955797-s6jpd" event={"ID":"f5d2e134-c722-47ef-b1c9-696e16fa72ce","Type":"ContainerStarted","Data":"5c34b9cb006ea767bfafe80232b6564f85c61f8ececc4cbba25ba8f5bc888cdb"} Jan 29 15:28:39 crc kubenswrapper[4753]: I0129 15:28:39.051530 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:28:39 crc kubenswrapper[4753]: I0129 15:28:39.067022 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6d955797-s6jpd" podStartSLOduration=2.067000476 podStartE2EDuration="2.067000476s" podCreationTimestamp="2026-01-29 15:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:28:39.066246745 +0000 UTC m=+5153.760981137" watchObservedRunningTime="2026-01-29 15:28:39.067000476 +0000 UTC m=+5153.761734858" Jan 29 15:28:40 crc kubenswrapper[4753]: I0129 15:28:40.150140 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:28:40 crc kubenswrapper[4753]: E0129 15:28:40.150593 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:28:52 crc kubenswrapper[4753]: I0129 15:28:52.150309 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:28:52 crc kubenswrapper[4753]: E0129 15:28:52.151118 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:29:06 crc kubenswrapper[4753]: I0129 15:29:06.769281 4753 scope.go:117] "RemoveContainer" containerID="41751e677c8457ccc4f8d9eb15ddc6bc84ca4ee1b36b7ad3a69b7193d6875bc6" Jan 29 15:29:06 crc kubenswrapper[4753]: I0129 15:29:06.800514 4753 scope.go:117] "RemoveContainer" containerID="f0f9d1da7e58e7226e1f6a9c4c95babfffb713f93552d7f86411075b063cdfb3" Jan 29 15:29:06 crc kubenswrapper[4753]: I0129 15:29:06.837110 4753 scope.go:117] "RemoveContainer" containerID="5c49c8ffceb19e2e710be36ab0abc78f98514c650584c7c90ccfa3464bbe5bed" Jan 29 15:29:06 crc kubenswrapper[4753]: I0129 15:29:06.876013 4753 scope.go:117] "RemoveContainer" containerID="bc1342b24fcbcd3ddf9a9d8cb4668c82367e9d7c93fe2055cd3afc60445c1f60" Jan 29 15:29:06 crc kubenswrapper[4753]: I0129 15:29:06.905515 4753 scope.go:117] "RemoveContainer" containerID="3f8a81ffdac224b8f2798d195f769e4326e7c6d0f75e0eb9a4ed6853d513c6ef" Jan 29 15:29:06 crc kubenswrapper[4753]: I0129 15:29:06.940059 4753 scope.go:117] "RemoveContainer" containerID="8932a393025a0bf84d862bb6479e52ec424f9054fb2ecd3b38fea88f5daf29bf" Jan 29 15:29:07 crc kubenswrapper[4753]: I0129 15:29:07.150320 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:29:07 crc kubenswrapper[4753]: E0129 15:29:07.150643 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:29:08 crc kubenswrapper[4753]: I0129 15:29:08.995083 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6d955797-s6jpd" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.190591 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.193705 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.196938 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.197247 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.197516 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4tpdj" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.199113 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.230416 4753 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T15:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T15:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-29T15:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b4f8494513a3af102066fec5868ab167ac8664aceb2f0c639d7a0b60260a944\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ddsfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-29T15:29:13Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.236351 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddsfg\" (UniqueName: \"kubernetes.io/projected/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-kube-api-access-ddsfg\") pod 
\"openstackclient\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.236662 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config\") pod \"openstackclient\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.236780 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config-secret\") pod \"openstackclient\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.253164 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 29 15:29:13 crc kubenswrapper[4753]: E0129 15:29:13.260347 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-ddsfg openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="38f66882-fe2f-495b-a3fc-8fbbdcd6e524" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.262918 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.294446 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.296464 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.299366 4753 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="38f66882-fe2f-495b-a3fc-8fbbdcd6e524" podUID="218a05db-5006-47e3-992f-2d49802ffe9f" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.303340 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.338907 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config\") pod \"openstackclient\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.340061 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config-secret\") pod \"openstackclient\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.340090 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config\") pod \"openstackclient\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.340121 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsfg\" (UniqueName: \"kubernetes.io/projected/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-kube-api-access-ddsfg\") pod \"openstackclient\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: E0129 15:29:13.343499 4753 projected.go:194] Error preparing data for projected volume kube-api-access-ddsfg for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (38f66882-fe2f-495b-a3fc-8fbbdcd6e524) does not match the UID in record. The object might have been deleted and then recreated Jan 29 15:29:13 crc kubenswrapper[4753]: E0129 15:29:13.343555 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-kube-api-access-ddsfg podName:38f66882-fe2f-495b-a3fc-8fbbdcd6e524 nodeName:}" failed. No retries permitted until 2026-01-29 15:29:13.843540959 +0000 UTC m=+5188.538275331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ddsfg" (UniqueName: "kubernetes.io/projected/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-kube-api-access-ddsfg") pod "openstackclient" (UID: "38f66882-fe2f-495b-a3fc-8fbbdcd6e524") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (38f66882-fe2f-495b-a3fc-8fbbdcd6e524) does not match the UID in record. 
The object might have been deleted and then recreated Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.356198 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config-secret\") pod \"openstackclient\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.361293 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.364584 4753 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="38f66882-fe2f-495b-a3fc-8fbbdcd6e524" podUID="218a05db-5006-47e3-992f-2d49802ffe9f" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.400535 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.403280 4753 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="38f66882-fe2f-495b-a3fc-8fbbdcd6e524" podUID="218a05db-5006-47e3-992f-2d49802ffe9f" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.440755 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config-secret\") pod \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.440873 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config\") pod \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\" (UID: \"38f66882-fe2f-495b-a3fc-8fbbdcd6e524\") " Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.440997 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/218a05db-5006-47e3-992f-2d49802ffe9f-openstack-config-secret\") pod \"openstackclient\" (UID: \"218a05db-5006-47e3-992f-2d49802ffe9f\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.441029 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/218a05db-5006-47e3-992f-2d49802ffe9f-openstack-config\") pod \"openstackclient\" (UID: \"218a05db-5006-47e3-992f-2d49802ffe9f\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.441087 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzwm5\" (UniqueName: \"kubernetes.io/projected/218a05db-5006-47e3-992f-2d49802ffe9f-kube-api-access-kzwm5\") pod \"openstackclient\" (UID: \"218a05db-5006-47e3-992f-2d49802ffe9f\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.441145 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddsfg\" (UniqueName: \"kubernetes.io/projected/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-kube-api-access-ddsfg\") on node \"crc\" DevicePath \"\"" Jan 29 15:29:13 crc kubenswrapper[4753]: 
I0129 15:29:13.441657 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "38f66882-fe2f-495b-a3fc-8fbbdcd6e524" (UID: "38f66882-fe2f-495b-a3fc-8fbbdcd6e524"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.445317 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "38f66882-fe2f-495b-a3fc-8fbbdcd6e524" (UID: "38f66882-fe2f-495b-a3fc-8fbbdcd6e524"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.542325 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzwm5\" (UniqueName: \"kubernetes.io/projected/218a05db-5006-47e3-992f-2d49802ffe9f-kube-api-access-kzwm5\") pod \"openstackclient\" (UID: \"218a05db-5006-47e3-992f-2d49802ffe9f\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.542503 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/218a05db-5006-47e3-992f-2d49802ffe9f-openstack-config-secret\") pod \"openstackclient\" (UID: \"218a05db-5006-47e3-992f-2d49802ffe9f\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.542537 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/218a05db-5006-47e3-992f-2d49802ffe9f-openstack-config\") pod \"openstackclient\" (UID: \"218a05db-5006-47e3-992f-2d49802ffe9f\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.543098 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.543113 4753 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/38f66882-fe2f-495b-a3fc-8fbbdcd6e524-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.543575 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/218a05db-5006-47e3-992f-2d49802ffe9f-openstack-config\") pod \"openstackclient\" (UID: \"218a05db-5006-47e3-992f-2d49802ffe9f\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.545963 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/218a05db-5006-47e3-992f-2d49802ffe9f-openstack-config-secret\") pod \"openstackclient\" (UID: \"218a05db-5006-47e3-992f-2d49802ffe9f\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.559575 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzwm5\" (UniqueName: \"kubernetes.io/projected/218a05db-5006-47e3-992f-2d49802ffe9f-kube-api-access-kzwm5\") pod 
\"openstackclient\" (UID: \"218a05db-5006-47e3-992f-2d49802ffe9f\") " pod="openstack/openstackclient" Jan 29 15:29:13 crc kubenswrapper[4753]: I0129 15:29:13.616927 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 15:29:14 crc kubenswrapper[4753]: I0129 15:29:14.054259 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 29 15:29:14 crc kubenswrapper[4753]: I0129 15:29:14.158041 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f66882-fe2f-495b-a3fc-8fbbdcd6e524" path="/var/lib/kubelet/pods/38f66882-fe2f-495b-a3fc-8fbbdcd6e524/volumes" Jan 29 15:29:14 crc kubenswrapper[4753]: I0129 15:29:14.371612 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 29 15:29:14 crc kubenswrapper[4753]: I0129 15:29:14.371685 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"218a05db-5006-47e3-992f-2d49802ffe9f","Type":"ContainerStarted","Data":"b75b639de1bd02b8eabd7043a3fd24aa891c9630912259daefc9b64bb55f570e"} Jan 29 15:29:14 crc kubenswrapper[4753]: I0129 15:29:14.371758 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"218a05db-5006-47e3-992f-2d49802ffe9f","Type":"ContainerStarted","Data":"715bab912195d58dab8789c1be1fbf3218e1078ec761f97533ee449f92bd63d6"} Jan 29 15:29:14 crc kubenswrapper[4753]: I0129 15:29:14.375944 4753 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="38f66882-fe2f-495b-a3fc-8fbbdcd6e524" podUID="218a05db-5006-47e3-992f-2d49802ffe9f" Jan 29 15:29:14 crc kubenswrapper[4753]: I0129 15:29:14.392699 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.392679288 podStartE2EDuration="1.392679288s" podCreationTimestamp="2026-01-29 15:29:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:29:14.389447991 +0000 UTC m=+5189.084182383" watchObservedRunningTime="2026-01-29 15:29:14.392679288 +0000 UTC m=+5189.087413670" Jan 29 15:29:14 crc kubenswrapper[4753]: I0129 15:29:14.399269 4753 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="38f66882-fe2f-495b-a3fc-8fbbdcd6e524" podUID="218a05db-5006-47e3-992f-2d49802ffe9f" Jan 29 15:29:21 crc kubenswrapper[4753]: I0129 15:29:21.150053 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:29:21 crc kubenswrapper[4753]: E0129 15:29:21.151005 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.704488 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5llvq"] Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.707040 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.716834 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5llvq"] Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.885861 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp242\" (UniqueName: \"kubernetes.io/projected/2f875247-b144-4ac2-9d5d-c10ab66cad52-kube-api-access-sp242\") pod \"redhat-marketplace-5llvq\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.886245 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-catalog-content\") pod \"redhat-marketplace-5llvq\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.886303 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-utilities\") pod \"redhat-marketplace-5llvq\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.988243 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp242\" (UniqueName: \"kubernetes.io/projected/2f875247-b144-4ac2-9d5d-c10ab66cad52-kube-api-access-sp242\") pod \"redhat-marketplace-5llvq\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.988460 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-catalog-content\") pod \"redhat-marketplace-5llvq\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.988499 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-utilities\") pod \"redhat-marketplace-5llvq\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.989078 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-catalog-content\") pod \"redhat-marketplace-5llvq\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:26 crc kubenswrapper[4753]: I0129 15:29:26.989208 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-utilities\") pod \"redhat-marketplace-5llvq\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:27 crc kubenswrapper[4753]: I0129 15:29:27.013097 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sp242\" (UniqueName: \"kubernetes.io/projected/2f875247-b144-4ac2-9d5d-c10ab66cad52-kube-api-access-sp242\") pod \"redhat-marketplace-5llvq\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:27 crc kubenswrapper[4753]: I0129 15:29:27.028475 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:27 crc kubenswrapper[4753]: I0129 15:29:27.532219 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5llvq"] Jan 29 15:29:27 crc kubenswrapper[4753]: W0129 15:29:27.538770 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f875247_b144_4ac2_9d5d_c10ab66cad52.slice/crio-ad0d4b7ffaaa216bb7baf048d9586299a0f77343480b4114184d89b645e0e466 WatchSource:0}: Error finding container ad0d4b7ffaaa216bb7baf048d9586299a0f77343480b4114184d89b645e0e466: Status 404 returned error can't find the container with id ad0d4b7ffaaa216bb7baf048d9586299a0f77343480b4114184d89b645e0e466 Jan 29 15:29:28 crc kubenswrapper[4753]: I0129 15:29:28.503275 4753 generic.go:334] "Generic (PLEG): container finished" podID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerID="26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c" exitCode=0 Jan 29 15:29:28 crc kubenswrapper[4753]: I0129 15:29:28.503516 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5llvq" event={"ID":"2f875247-b144-4ac2-9d5d-c10ab66cad52","Type":"ContainerDied","Data":"26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c"} Jan 29 15:29:28 crc kubenswrapper[4753]: I0129 15:29:28.503543 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5llvq" event={"ID":"2f875247-b144-4ac2-9d5d-c10ab66cad52","Type":"ContainerStarted","Data":"ad0d4b7ffaaa216bb7baf048d9586299a0f77343480b4114184d89b645e0e466"} Jan 29 15:29:29 crc kubenswrapper[4753]: I0129 15:29:29.512892 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5llvq" event={"ID":"2f875247-b144-4ac2-9d5d-c10ab66cad52","Type":"ContainerStarted","Data":"ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a"} Jan 29 15:29:30 crc kubenswrapper[4753]: I0129 15:29:30.525320 4753 generic.go:334] "Generic (PLEG): container finished" podID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerID="ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a" exitCode=0 Jan 29 15:29:30 crc kubenswrapper[4753]: I0129 15:29:30.525424 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5llvq" event={"ID":"2f875247-b144-4ac2-9d5d-c10ab66cad52","Type":"ContainerDied","Data":"ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a"} Jan 29 15:29:31 crc kubenswrapper[4753]: I0129 15:29:31.534812 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5llvq" event={"ID":"2f875247-b144-4ac2-9d5d-c10ab66cad52","Type":"ContainerStarted","Data":"8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85"} Jan 29 15:29:31 crc kubenswrapper[4753]: I0129 15:29:31.560853 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5llvq" podStartSLOduration=3.128475294 
podStartE2EDuration="5.560836264s" podCreationTimestamp="2026-01-29 15:29:26 +0000 UTC" firstStartedPulling="2026-01-29 15:29:28.506342927 +0000 UTC m=+5203.201077309" lastFinishedPulling="2026-01-29 15:29:30.938703887 +0000 UTC m=+5205.633438279" observedRunningTime="2026-01-29 15:29:31.554798732 +0000 UTC m=+5206.249533114" watchObservedRunningTime="2026-01-29 15:29:31.560836264 +0000 UTC m=+5206.255570646" Jan 29 15:29:36 crc kubenswrapper[4753]: I0129 15:29:36.160629 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:29:36 crc kubenswrapper[4753]: E0129 15:29:36.162240 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:29:37 crc kubenswrapper[4753]: I0129 15:29:37.029058 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:37 crc kubenswrapper[4753]: I0129 15:29:37.029220 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:37 crc kubenswrapper[4753]: I0129 15:29:37.095078 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:37 crc kubenswrapper[4753]: I0129 15:29:37.654290 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:37 crc kubenswrapper[4753]: I0129 15:29:37.716457 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5llvq"] Jan 29 15:29:39 crc kubenswrapper[4753]: I0129 15:29:39.602042 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5llvq" podUID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerName="registry-server" containerID="cri-o://8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85" gracePeriod=2 Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.103698 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.235496 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-catalog-content\") pod \"2f875247-b144-4ac2-9d5d-c10ab66cad52\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.235583 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-utilities\") pod \"2f875247-b144-4ac2-9d5d-c10ab66cad52\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.235760 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp242\" (UniqueName: \"kubernetes.io/projected/2f875247-b144-4ac2-9d5d-c10ab66cad52-kube-api-access-sp242\") pod \"2f875247-b144-4ac2-9d5d-c10ab66cad52\" (UID: \"2f875247-b144-4ac2-9d5d-c10ab66cad52\") " Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.236508 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-utilities" (OuterVolumeSpecName: "utilities") pod "2f875247-b144-4ac2-9d5d-c10ab66cad52" (UID: "2f875247-b144-4ac2-9d5d-c10ab66cad52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.242348 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f875247-b144-4ac2-9d5d-c10ab66cad52-kube-api-access-sp242" (OuterVolumeSpecName: "kube-api-access-sp242") pod "2f875247-b144-4ac2-9d5d-c10ab66cad52" (UID: "2f875247-b144-4ac2-9d5d-c10ab66cad52"). InnerVolumeSpecName "kube-api-access-sp242". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.257486 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f875247-b144-4ac2-9d5d-c10ab66cad52" (UID: "2f875247-b144-4ac2-9d5d-c10ab66cad52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.337729 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.337756 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp242\" (UniqueName: \"kubernetes.io/projected/2f875247-b144-4ac2-9d5d-c10ab66cad52-kube-api-access-sp242\") on node \"crc\" DevicePath \"\"" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.337768 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f875247-b144-4ac2-9d5d-c10ab66cad52-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.610326 4753 generic.go:334] "Generic (PLEG): container finished" podID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerID="8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85" exitCode=0 Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.610375 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5llvq" event={"ID":"2f875247-b144-4ac2-9d5d-c10ab66cad52","Type":"ContainerDied","Data":"8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85"} Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.610668 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5llvq" event={"ID":"2f875247-b144-4ac2-9d5d-c10ab66cad52","Type":"ContainerDied","Data":"ad0d4b7ffaaa216bb7baf048d9586299a0f77343480b4114184d89b645e0e466"} Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.610444 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5llvq" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.610689 4753 scope.go:117] "RemoveContainer" containerID="8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.631932 4753 scope.go:117] "RemoveContainer" containerID="ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.653430 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5llvq"] Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.659772 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5llvq"] Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.678428 4753 scope.go:117] "RemoveContainer" containerID="26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.713921 4753 scope.go:117] "RemoveContainer" containerID="8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85" Jan 29 15:29:40 crc kubenswrapper[4753]: E0129 15:29:40.714427 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85\": container with ID starting with 8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85 not found: ID does not exist" containerID="8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.714482 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85"} err="failed to get container status \"8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85\": rpc error: code = NotFound desc = could not find container \"8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85\": container with ID starting with 8780603b8b8f051ef74acbe46f4a61752fc9f2669959a2b11c6a87cae4848a85 not found: ID does not exist" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.714519 4753 scope.go:117] "RemoveContainer" containerID="ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a" Jan 29 15:29:40 crc kubenswrapper[4753]: E0129 15:29:40.714987 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a\": container with ID starting with ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a not found: ID does not exist" containerID="ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.715017 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a"} err="failed to get container status \"ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a\": rpc error: code = NotFound desc = could not find container \"ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a\": container with ID starting with ebc1503191aa95d440ef74cbf0d1c2b523d3453dec6495887c3616a32c96a62a not found: ID does not exist" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.715036 4753 scope.go:117] "RemoveContainer" 
containerID="26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c" Jan 29 15:29:40 crc kubenswrapper[4753]: E0129 15:29:40.715475 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c\": container with ID starting with 26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c not found: ID does not exist" containerID="26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c" Jan 29 15:29:40 crc kubenswrapper[4753]: I0129 15:29:40.715521 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c"} err="failed to get container status \"26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c\": rpc error: code = NotFound desc = could not find container \"26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c\": container with ID starting with 26e18292eccfed8b1d8f451ae0446d6eeec8256cdf58a5d10991f93c0a84bf7c not found: ID does not exist" Jan 29 15:29:42 crc kubenswrapper[4753]: I0129 15:29:42.165751 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f875247-b144-4ac2-9d5d-c10ab66cad52" path="/var/lib/kubelet/pods/2f875247-b144-4ac2-9d5d-c10ab66cad52/volumes" Jan 29 15:29:51 crc kubenswrapper[4753]: I0129 15:29:51.152044 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:29:51 crc kubenswrapper[4753]: E0129 15:29:51.152983 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.166496 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n"] Jan 29 15:30:00 crc kubenswrapper[4753]: E0129 15:30:00.168125 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerName="extract-utilities" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.168264 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerName="extract-utilities" Jan 29 15:30:00 crc kubenswrapper[4753]: E0129 15:30:00.168347 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerName="registry-server" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.168404 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerName="registry-server" Jan 29 15:30:00 crc kubenswrapper[4753]: E0129 15:30:00.168472 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerName="extract-content" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.169331 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerName="extract-content" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.169789 4753 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2f875247-b144-4ac2-9d5d-c10ab66cad52" containerName="registry-server" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.170544 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n"] Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.170652 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.173760 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.182885 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.247129 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9ftn\" (UniqueName: \"kubernetes.io/projected/6289196e-5cbb-47e7-88a5-9beb01531f47-kube-api-access-p9ftn\") pod \"collect-profiles-29495010-k6p2n\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.247252 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6289196e-5cbb-47e7-88a5-9beb01531f47-secret-volume\") pod \"collect-profiles-29495010-k6p2n\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.247282 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6289196e-5cbb-47e7-88a5-9beb01531f47-config-volume\") pod \"collect-profiles-29495010-k6p2n\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.348367 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6289196e-5cbb-47e7-88a5-9beb01531f47-secret-volume\") pod \"collect-profiles-29495010-k6p2n\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.348415 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6289196e-5cbb-47e7-88a5-9beb01531f47-config-volume\") pod \"collect-profiles-29495010-k6p2n\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.348519 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9ftn\" (UniqueName: \"kubernetes.io/projected/6289196e-5cbb-47e7-88a5-9beb01531f47-kube-api-access-p9ftn\") pod \"collect-profiles-29495010-k6p2n\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 
15:30:00.349526 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6289196e-5cbb-47e7-88a5-9beb01531f47-config-volume\") pod \"collect-profiles-29495010-k6p2n\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.360158 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6289196e-5cbb-47e7-88a5-9beb01531f47-secret-volume\") pod \"collect-profiles-29495010-k6p2n\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.381209 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9ftn\" (UniqueName: \"kubernetes.io/projected/6289196e-5cbb-47e7-88a5-9beb01531f47-kube-api-access-p9ftn\") pod \"collect-profiles-29495010-k6p2n\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.495311 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:00 crc kubenswrapper[4753]: I0129 15:30:00.924259 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n"] Jan 29 15:30:01 crc kubenswrapper[4753]: I0129 15:30:01.800235 4753 generic.go:334] "Generic (PLEG): container finished" podID="6289196e-5cbb-47e7-88a5-9beb01531f47" containerID="3d86747227e10fc6082886a498f5fcf5706e500e79bed514b8c3989085ad4c7c" exitCode=0 Jan 29 15:30:01 crc kubenswrapper[4753]: I0129 15:30:01.800292 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" event={"ID":"6289196e-5cbb-47e7-88a5-9beb01531f47","Type":"ContainerDied","Data":"3d86747227e10fc6082886a498f5fcf5706e500e79bed514b8c3989085ad4c7c"} Jan 29 15:30:01 crc kubenswrapper[4753]: I0129 15:30:01.800830 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" event={"ID":"6289196e-5cbb-47e7-88a5-9beb01531f47","Type":"ContainerStarted","Data":"1b3d018e3fee912f9bd4e3c76115802f61ab868b1f6643142774fb7983ae5577"} Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.133794 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.193232 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6289196e-5cbb-47e7-88a5-9beb01531f47-secret-volume\") pod \"6289196e-5cbb-47e7-88a5-9beb01531f47\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.193333 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6289196e-5cbb-47e7-88a5-9beb01531f47-config-volume\") pod \"6289196e-5cbb-47e7-88a5-9beb01531f47\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.193388 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9ftn\" (UniqueName: \"kubernetes.io/projected/6289196e-5cbb-47e7-88a5-9beb01531f47-kube-api-access-p9ftn\") pod \"6289196e-5cbb-47e7-88a5-9beb01531f47\" (UID: \"6289196e-5cbb-47e7-88a5-9beb01531f47\") " Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.193955 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6289196e-5cbb-47e7-88a5-9beb01531f47-config-volume" (OuterVolumeSpecName: "config-volume") pod "6289196e-5cbb-47e7-88a5-9beb01531f47" (UID: "6289196e-5cbb-47e7-88a5-9beb01531f47"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.200513 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6289196e-5cbb-47e7-88a5-9beb01531f47-kube-api-access-p9ftn" (OuterVolumeSpecName: "kube-api-access-p9ftn") pod "6289196e-5cbb-47e7-88a5-9beb01531f47" (UID: "6289196e-5cbb-47e7-88a5-9beb01531f47"). InnerVolumeSpecName "kube-api-access-p9ftn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.206432 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6289196e-5cbb-47e7-88a5-9beb01531f47-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6289196e-5cbb-47e7-88a5-9beb01531f47" (UID: "6289196e-5cbb-47e7-88a5-9beb01531f47"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.295755 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6289196e-5cbb-47e7-88a5-9beb01531f47-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.295797 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9ftn\" (UniqueName: \"kubernetes.io/projected/6289196e-5cbb-47e7-88a5-9beb01531f47-kube-api-access-p9ftn\") on node \"crc\" DevicePath \"\"" Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.295808 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6289196e-5cbb-47e7-88a5-9beb01531f47-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.816543 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" event={"ID":"6289196e-5cbb-47e7-88a5-9beb01531f47","Type":"ContainerDied","Data":"1b3d018e3fee912f9bd4e3c76115802f61ab868b1f6643142774fb7983ae5577"} Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.816829 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b3d018e3fee912f9bd4e3c76115802f61ab868b1f6643142774fb7983ae5577" Jan 29 15:30:03 crc kubenswrapper[4753]: I0129 15:30:03.816823 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495010-k6p2n" Jan 29 15:30:04 crc kubenswrapper[4753]: I0129 15:30:04.150969 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:30:04 crc kubenswrapper[4753]: E0129 15:30:04.151417 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:30:04 crc kubenswrapper[4753]: I0129 15:30:04.229849 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g"] Jan 29 15:30:04 crc kubenswrapper[4753]: I0129 15:30:04.238410 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494965-nts8g"] Jan 29 15:30:06 crc kubenswrapper[4753]: I0129 15:30:06.171787 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ef84df-77f3-4784-9297-24192a945026" path="/var/lib/kubelet/pods/c0ef84df-77f3-4784-9297-24192a945026/volumes" Jan 29 15:30:07 crc kubenswrapper[4753]: I0129 15:30:07.073833 4753 scope.go:117] "RemoveContainer" containerID="ea199e04f3cbe79117dab2c6f58bb9d022ccfb9c01a1e0fb349945c347aefd7c" Jan 29 15:30:15 crc kubenswrapper[4753]: I0129 15:30:15.150019 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:30:15 crc kubenswrapper[4753]: E0129 15:30:15.151377 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:30:29 crc kubenswrapper[4753]: I0129 15:30:29.149646 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:30:29 crc kubenswrapper[4753]: E0129 15:30:29.150446 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:30:44 crc kubenswrapper[4753]: I0129 15:30:44.150038 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:30:44 crc kubenswrapper[4753]: E0129 15:30:44.151660 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:30:50 crc kubenswrapper[4753]: I0129 15:30:50.066583 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nd6z4"] Jan 29 15:30:50 crc kubenswrapper[4753]: I0129 15:30:50.078236 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nd6z4"] Jan 29 15:30:50 crc kubenswrapper[4753]: I0129 15:30:50.161296 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c997c121-d070-4686-ba00-f4a025096b7b" path="/var/lib/kubelet/pods/c997c121-d070-4686-ba00-f4a025096b7b/volumes" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.453617 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-fts8x"] Jan 29 15:30:51 crc kubenswrapper[4753]: E0129 15:30:51.454118 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6289196e-5cbb-47e7-88a5-9beb01531f47" containerName="collect-profiles" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.454129 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6289196e-5cbb-47e7-88a5-9beb01531f47" containerName="collect-profiles" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.454812 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6289196e-5cbb-47e7-88a5-9beb01531f47" containerName="collect-profiles" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.455539 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.468912 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fts8x"] Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.549872 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-aa39-account-create-update-mnzk6"] Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.551112 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.552877 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.558127 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aa39-account-create-update-mnzk6"] Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.567394 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-operator-scripts\") pod \"barbican-db-create-fts8x\" (UID: \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\") " pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.567463 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8svg\" (UniqueName: \"kubernetes.io/projected/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-kube-api-access-s8svg\") pod \"barbican-db-create-fts8x\" (UID: \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\") " pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.669442 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpmvk\" (UniqueName: \"kubernetes.io/projected/beaf5340-3a9d-4712-8524-606a091f544e-kube-api-access-zpmvk\") pod \"barbican-aa39-account-create-update-mnzk6\" (UID: \"beaf5340-3a9d-4712-8524-606a091f544e\") " pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.669551 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8svg\" (UniqueName: \"kubernetes.io/projected/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-kube-api-access-s8svg\") pod \"barbican-db-create-fts8x\" (UID: \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\") " pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.669718 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaf5340-3a9d-4712-8524-606a091f544e-operator-scripts\") pod \"barbican-aa39-account-create-update-mnzk6\" (UID: \"beaf5340-3a9d-4712-8524-606a091f544e\") " pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.669818 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-operator-scripts\") pod \"barbican-db-create-fts8x\" (UID: \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\") " pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.671142 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-operator-scripts\") pod \"barbican-db-create-fts8x\" (UID: \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\") " pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.696583 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8svg\" (UniqueName: \"kubernetes.io/projected/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-kube-api-access-s8svg\") pod \"barbican-db-create-fts8x\" (UID: \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\") " pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.771724 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaf5340-3a9d-4712-8524-606a091f544e-operator-scripts\") pod \"barbican-aa39-account-create-update-mnzk6\" (UID: \"beaf5340-3a9d-4712-8524-606a091f544e\") " pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.771931 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpmvk\" (UniqueName: \"kubernetes.io/projected/beaf5340-3a9d-4712-8524-606a091f544e-kube-api-access-zpmvk\") pod \"barbican-aa39-account-create-update-mnzk6\" (UID: \"beaf5340-3a9d-4712-8524-606a091f544e\") " pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.773412 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaf5340-3a9d-4712-8524-606a091f544e-operator-scripts\") pod \"barbican-aa39-account-create-update-mnzk6\" (UID: \"beaf5340-3a9d-4712-8524-606a091f544e\") " pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.775770 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.796795 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpmvk\" (UniqueName: \"kubernetes.io/projected/beaf5340-3a9d-4712-8524-606a091f544e-kube-api-access-zpmvk\") pod \"barbican-aa39-account-create-update-mnzk6\" (UID: \"beaf5340-3a9d-4712-8524-606a091f544e\") " pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:51 crc kubenswrapper[4753]: I0129 15:30:51.867742 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:52 crc kubenswrapper[4753]: I0129 15:30:52.218007 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fts8x"] Jan 29 15:30:52 crc kubenswrapper[4753]: I0129 15:30:52.292794 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fts8x" event={"ID":"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c","Type":"ContainerStarted","Data":"343be93b9c0dcc720269242d51d0470ecccc43a75f23fa6b0065136266fa7a04"} Jan 29 15:30:52 crc kubenswrapper[4753]: I0129 15:30:52.305308 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aa39-account-create-update-mnzk6"] Jan 29 15:30:52 crc kubenswrapper[4753]: W0129 15:30:52.316709 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbeaf5340_3a9d_4712_8524_606a091f544e.slice/crio-64aae4286951c01c76bc80a63a1e6a043ce562c975c0389c503124b984c77754 WatchSource:0}: Error finding container 64aae4286951c01c76bc80a63a1e6a043ce562c975c0389c503124b984c77754: Status 404 returned error can't find the container with id 64aae4286951c01c76bc80a63a1e6a043ce562c975c0389c503124b984c77754 Jan 29 15:30:53 crc kubenswrapper[4753]: I0129 15:30:53.306568 4753 generic.go:334] "Generic (PLEG): container finished" podID="c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c" containerID="b08af0664387ab2af96e764a332519279b48e120cb4c65117e46057e6fdf2f43" exitCode=0 Jan 29 15:30:53 crc kubenswrapper[4753]: I0129 15:30:53.306641 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fts8x" event={"ID":"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c","Type":"ContainerDied","Data":"b08af0664387ab2af96e764a332519279b48e120cb4c65117e46057e6fdf2f43"} Jan 29 15:30:53 crc kubenswrapper[4753]: I0129 15:30:53.311008 4753 generic.go:334] "Generic (PLEG): container finished" podID="beaf5340-3a9d-4712-8524-606a091f544e" containerID="c1072d9eff4fa34a5c6e8776018646f90816190562caca57895700347741c858" exitCode=0 Jan 29 15:30:53 crc kubenswrapper[4753]: I0129 15:30:53.311212 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aa39-account-create-update-mnzk6" event={"ID":"beaf5340-3a9d-4712-8524-606a091f544e","Type":"ContainerDied","Data":"c1072d9eff4fa34a5c6e8776018646f90816190562caca57895700347741c858"} Jan 29 15:30:53 crc kubenswrapper[4753]: I0129 15:30:53.311266 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aa39-account-create-update-mnzk6" event={"ID":"beaf5340-3a9d-4712-8524-606a091f544e","Type":"ContainerStarted","Data":"64aae4286951c01c76bc80a63a1e6a043ce562c975c0389c503124b984c77754"} Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.817350 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.828867 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.937780 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-operator-scripts\") pod \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\" (UID: \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\") " Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.937823 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpmvk\" (UniqueName: \"kubernetes.io/projected/beaf5340-3a9d-4712-8524-606a091f544e-kube-api-access-zpmvk\") pod \"beaf5340-3a9d-4712-8524-606a091f544e\" (UID: \"beaf5340-3a9d-4712-8524-606a091f544e\") " Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.937858 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaf5340-3a9d-4712-8524-606a091f544e-operator-scripts\") pod \"beaf5340-3a9d-4712-8524-606a091f544e\" (UID: \"beaf5340-3a9d-4712-8524-606a091f544e\") " Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.937977 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8svg\" (UniqueName: \"kubernetes.io/projected/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-kube-api-access-s8svg\") pod \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\" (UID: \"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c\") " Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.939906 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c" (UID: "c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.939980 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beaf5340-3a9d-4712-8524-606a091f544e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "beaf5340-3a9d-4712-8524-606a091f544e" (UID: "beaf5340-3a9d-4712-8524-606a091f544e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.943040 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-kube-api-access-s8svg" (OuterVolumeSpecName: "kube-api-access-s8svg") pod "c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c" (UID: "c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c"). InnerVolumeSpecName "kube-api-access-s8svg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:30:54 crc kubenswrapper[4753]: I0129 15:30:54.943470 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beaf5340-3a9d-4712-8524-606a091f544e-kube-api-access-zpmvk" (OuterVolumeSpecName: "kube-api-access-zpmvk") pod "beaf5340-3a9d-4712-8524-606a091f544e" (UID: "beaf5340-3a9d-4712-8524-606a091f544e"). InnerVolumeSpecName "kube-api-access-zpmvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.039737 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.039777 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpmvk\" (UniqueName: \"kubernetes.io/projected/beaf5340-3a9d-4712-8524-606a091f544e-kube-api-access-zpmvk\") on node \"crc\" DevicePath \"\"" Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.039791 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beaf5340-3a9d-4712-8524-606a091f544e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.039802 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8svg\" (UniqueName: \"kubernetes.io/projected/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c-kube-api-access-s8svg\") on node \"crc\" DevicePath \"\"" Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.149664 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:30:55 crc kubenswrapper[4753]: E0129 15:30:55.150118 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.333524 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fts8x" Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.333546 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fts8x" event={"ID":"c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c","Type":"ContainerDied","Data":"343be93b9c0dcc720269242d51d0470ecccc43a75f23fa6b0065136266fa7a04"} Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.334023 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="343be93b9c0dcc720269242d51d0470ecccc43a75f23fa6b0065136266fa7a04" Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.336321 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aa39-account-create-update-mnzk6" event={"ID":"beaf5340-3a9d-4712-8524-606a091f544e","Type":"ContainerDied","Data":"64aae4286951c01c76bc80a63a1e6a043ce562c975c0389c503124b984c77754"} Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.336365 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64aae4286951c01c76bc80a63a1e6a043ce562c975c0389c503124b984c77754" Jan 29 15:30:55 crc kubenswrapper[4753]: I0129 15:30:55.336414 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-aa39-account-create-update-mnzk6" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.831341 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-46k8w"] Jan 29 15:30:56 crc kubenswrapper[4753]: E0129 15:30:56.831928 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beaf5340-3a9d-4712-8524-606a091f544e" containerName="mariadb-account-create-update" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.831946 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="beaf5340-3a9d-4712-8524-606a091f544e" containerName="mariadb-account-create-update" Jan 29 15:30:56 crc kubenswrapper[4753]: E0129 15:30:56.831990 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c" containerName="mariadb-database-create" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.831998 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c" containerName="mariadb-database-create" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.832209 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c" containerName="mariadb-database-create" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.832242 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="beaf5340-3a9d-4712-8524-606a091f544e" containerName="mariadb-account-create-update" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.832952 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.836195 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6hxgp" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.838056 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.844327 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-46k8w"] Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.981081 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-db-sync-config-data\") pod \"barbican-db-sync-46k8w\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.981206 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-combined-ca-bundle\") pod \"barbican-db-sync-46k8w\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:56 crc kubenswrapper[4753]: I0129 15:30:56.981283 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt6sn\" (UniqueName: \"kubernetes.io/projected/a2834447-2ee8-4608-85ff-805e2fcbe7c6-kube-api-access-zt6sn\") pod \"barbican-db-sync-46k8w\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:57 crc kubenswrapper[4753]: I0129 15:30:57.082602 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zt6sn\" (UniqueName: \"kubernetes.io/projected/a2834447-2ee8-4608-85ff-805e2fcbe7c6-kube-api-access-zt6sn\") pod \"barbican-db-sync-46k8w\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:57 crc kubenswrapper[4753]: I0129 15:30:57.083034 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-db-sync-config-data\") pod \"barbican-db-sync-46k8w\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:57 crc kubenswrapper[4753]: I0129 15:30:57.083111 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-combined-ca-bundle\") pod \"barbican-db-sync-46k8w\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:57 crc kubenswrapper[4753]: I0129 15:30:57.105071 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-db-sync-config-data\") pod \"barbican-db-sync-46k8w\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:57 crc kubenswrapper[4753]: I0129 15:30:57.105854 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-combined-ca-bundle\") pod \"barbican-db-sync-46k8w\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:57 crc kubenswrapper[4753]: I0129 15:30:57.109999 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt6sn\" (UniqueName: \"kubernetes.io/projected/a2834447-2ee8-4608-85ff-805e2fcbe7c6-kube-api-access-zt6sn\") pod \"barbican-db-sync-46k8w\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:57 crc kubenswrapper[4753]: I0129 15:30:57.210192 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-46k8w" Jan 29 15:30:57 crc kubenswrapper[4753]: I0129 15:30:57.642388 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-46k8w"] Jan 29 15:30:58 crc kubenswrapper[4753]: I0129 15:30:58.368230 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-46k8w" event={"ID":"a2834447-2ee8-4608-85ff-805e2fcbe7c6","Type":"ContainerStarted","Data":"0ebb9fc893c0cbdfd16f60f2ea3e4e446d75221e24a32b2cf2df3561df3b9191"} Jan 29 15:30:58 crc kubenswrapper[4753]: I0129 15:30:58.368598 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-46k8w" event={"ID":"a2834447-2ee8-4608-85ff-805e2fcbe7c6","Type":"ContainerStarted","Data":"943e3894e00b1ca513697d8e2e1303a5efd73f7ceed6f60fb1302b8e87ef0f44"} Jan 29 15:30:58 crc kubenswrapper[4753]: I0129 15:30:58.401244 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-46k8w" podStartSLOduration=2.401218883 podStartE2EDuration="2.401218883s" podCreationTimestamp="2026-01-29 15:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:30:58.393842635 +0000 UTC m=+5293.088577057" watchObservedRunningTime="2026-01-29 15:30:58.401218883 +0000 UTC m=+5293.095953305" Jan 29 15:30:59 crc kubenswrapper[4753]: I0129 15:30:59.379501 4753 generic.go:334] "Generic (PLEG): container finished" podID="a2834447-2ee8-4608-85ff-805e2fcbe7c6" containerID="0ebb9fc893c0cbdfd16f60f2ea3e4e446d75221e24a32b2cf2df3561df3b9191" exitCode=0 Jan 29 15:30:59 crc kubenswrapper[4753]: I0129 15:30:59.379585 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-46k8w" event={"ID":"a2834447-2ee8-4608-85ff-805e2fcbe7c6","Type":"ContainerDied","Data":"0ebb9fc893c0cbdfd16f60f2ea3e4e446d75221e24a32b2cf2df3561df3b9191"} Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.754245 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-46k8w" Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.852879 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-db-sync-config-data\") pod \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.853087 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-combined-ca-bundle\") pod \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.853190 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt6sn\" (UniqueName: \"kubernetes.io/projected/a2834447-2ee8-4608-85ff-805e2fcbe7c6-kube-api-access-zt6sn\") pod \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\" (UID: \"a2834447-2ee8-4608-85ff-805e2fcbe7c6\") " Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.857722 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a2834447-2ee8-4608-85ff-805e2fcbe7c6" (UID: "a2834447-2ee8-4608-85ff-805e2fcbe7c6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.857806 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2834447-2ee8-4608-85ff-805e2fcbe7c6-kube-api-access-zt6sn" (OuterVolumeSpecName: "kube-api-access-zt6sn") pod "a2834447-2ee8-4608-85ff-805e2fcbe7c6" (UID: "a2834447-2ee8-4608-85ff-805e2fcbe7c6"). InnerVolumeSpecName "kube-api-access-zt6sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.877224 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2834447-2ee8-4608-85ff-805e2fcbe7c6" (UID: "a2834447-2ee8-4608-85ff-805e2fcbe7c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.955956 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.956019 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt6sn\" (UniqueName: \"kubernetes.io/projected/a2834447-2ee8-4608-85ff-805e2fcbe7c6-kube-api-access-zt6sn\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:00 crc kubenswrapper[4753]: I0129 15:31:00.956040 4753 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2834447-2ee8-4608-85ff-805e2fcbe7c6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.398371 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-46k8w" event={"ID":"a2834447-2ee8-4608-85ff-805e2fcbe7c6","Type":"ContainerDied","Data":"943e3894e00b1ca513697d8e2e1303a5efd73f7ceed6f60fb1302b8e87ef0f44"} Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.398424 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="943e3894e00b1ca513697d8e2e1303a5efd73f7ceed6f60fb1302b8e87ef0f44" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.398531 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-46k8w" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.719709 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78c4f66974-59mhf"] Jan 29 15:31:01 crc kubenswrapper[4753]: E0129 15:31:01.720469 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2834447-2ee8-4608-85ff-805e2fcbe7c6" containerName="barbican-db-sync" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.720494 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2834447-2ee8-4608-85ff-805e2fcbe7c6" containerName="barbican-db-sync" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.720692 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2834447-2ee8-4608-85ff-805e2fcbe7c6" containerName="barbican-db-sync" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.721762 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.726669 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6hxgp" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.726861 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.726985 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.761631 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7cfd7cb57f-82mpw"] Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.763423 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.765563 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78c4f66974-59mhf"] Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.769058 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.790237 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cfd7cb57f-82mpw"] Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.818549 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7df66dbc59-6p6js"] Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.820250 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.839880 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7df66dbc59-6p6js"] Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.874992 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8d4h\" (UniqueName: \"kubernetes.io/projected/1b997396-38f1-426e-a2d8-318808c53a6c-kube-api-access-z8d4h\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875135 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b997396-38f1-426e-a2d8-318808c53a6c-logs\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875189 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdg56\" (UniqueName: \"kubernetes.io/projected/710ca968-bd29-41e7-9101-11e445b4fc1b-kube-api-access-zdg56\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875238 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/710ca968-bd29-41e7-9101-11e445b4fc1b-config-data-custom\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875278 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b997396-38f1-426e-a2d8-318808c53a6c-config-data-custom\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875303 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710ca968-bd29-41e7-9101-11e445b4fc1b-combined-ca-bundle\") pod 
\"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875325 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710ca968-bd29-41e7-9101-11e445b4fc1b-config-data\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875354 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b997396-38f1-426e-a2d8-318808c53a6c-combined-ca-bundle\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875380 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b997396-38f1-426e-a2d8-318808c53a6c-config-data\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875443 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710ca968-bd29-41e7-9101-11e445b4fc1b-logs\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.875911 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-699cfdb8d4-skqb8"] Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.877845 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.880268 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.887311 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-699cfdb8d4-skqb8"] Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977118 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b997396-38f1-426e-a2d8-318808c53a6c-logs\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977198 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdg56\" (UniqueName: \"kubernetes.io/projected/710ca968-bd29-41e7-9101-11e445b4fc1b-kube-api-access-zdg56\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977241 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-dns-svc\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977270 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a04bc36-b333-40bb-8a95-38a148b53e8b-combined-ca-bundle\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977307 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/710ca968-bd29-41e7-9101-11e445b4fc1b-config-data-custom\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977451 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-sb\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977490 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b997396-38f1-426e-a2d8-318808c53a6c-config-data-custom\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977518 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710ca968-bd29-41e7-9101-11e445b4fc1b-combined-ca-bundle\") pod 
\"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977542 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-config\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977579 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710ca968-bd29-41e7-9101-11e445b4fc1b-config-data\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977618 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a04bc36-b333-40bb-8a95-38a148b53e8b-config-data-custom\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977644 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b997396-38f1-426e-a2d8-318808c53a6c-combined-ca-bundle\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977669 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b997396-38f1-426e-a2d8-318808c53a6c-config-data\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977699 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwskh\" (UniqueName: \"kubernetes.io/projected/691aa928-1e2b-4e6c-a43b-29c523569e2c-kube-api-access-qwskh\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977696 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b997396-38f1-426e-a2d8-318808c53a6c-logs\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977798 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710ca968-bd29-41e7-9101-11e445b4fc1b-logs\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977909 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-728zv\" 
(UniqueName: \"kubernetes.io/projected/2a04bc36-b333-40bb-8a95-38a148b53e8b-kube-api-access-728zv\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.977967 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-nb\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.978004 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a04bc36-b333-40bb-8a95-38a148b53e8b-config-data\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.978070 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8d4h\" (UniqueName: \"kubernetes.io/projected/1b997396-38f1-426e-a2d8-318808c53a6c-kube-api-access-z8d4h\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.978119 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a04bc36-b333-40bb-8a95-38a148b53e8b-logs\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.978531 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710ca968-bd29-41e7-9101-11e445b4fc1b-logs\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.981781 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710ca968-bd29-41e7-9101-11e445b4fc1b-combined-ca-bundle\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.981790 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b997396-38f1-426e-a2d8-318808c53a6c-combined-ca-bundle\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.981794 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/710ca968-bd29-41e7-9101-11e445b4fc1b-config-data-custom\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.983073 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b997396-38f1-426e-a2d8-318808c53a6c-config-data-custom\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.993802 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710ca968-bd29-41e7-9101-11e445b4fc1b-config-data\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.995320 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdg56\" (UniqueName: \"kubernetes.io/projected/710ca968-bd29-41e7-9101-11e445b4fc1b-kube-api-access-zdg56\") pod \"barbican-keystone-listener-78c4f66974-59mhf\" (UID: \"710ca968-bd29-41e7-9101-11e445b4fc1b\") " pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:01 crc kubenswrapper[4753]: I0129 15:31:01.995632 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b997396-38f1-426e-a2d8-318808c53a6c-config-data\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.007067 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8d4h\" (UniqueName: \"kubernetes.io/projected/1b997396-38f1-426e-a2d8-318808c53a6c-kube-api-access-z8d4h\") pod \"barbican-worker-7cfd7cb57f-82mpw\" (UID: \"1b997396-38f1-426e-a2d8-318808c53a6c\") " pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.048121 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080395 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-sb\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080444 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-config\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080466 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a04bc36-b333-40bb-8a95-38a148b53e8b-config-data-custom\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080496 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwskh\" (UniqueName: \"kubernetes.io/projected/691aa928-1e2b-4e6c-a43b-29c523569e2c-kube-api-access-qwskh\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080535 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-728zv\" (UniqueName: \"kubernetes.io/projected/2a04bc36-b333-40bb-8a95-38a148b53e8b-kube-api-access-728zv\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080555 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-nb\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080574 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a04bc36-b333-40bb-8a95-38a148b53e8b-config-data\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080603 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a04bc36-b333-40bb-8a95-38a148b53e8b-logs\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080655 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-dns-svc\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " 
pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.080675 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a04bc36-b333-40bb-8a95-38a148b53e8b-combined-ca-bundle\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.081384 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a04bc36-b333-40bb-8a95-38a148b53e8b-logs\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.081678 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-sb\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.081738 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-config\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.081958 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-nb\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.082224 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-dns-svc\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.085515 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a04bc36-b333-40bb-8a95-38a148b53e8b-combined-ca-bundle\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.085983 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a04bc36-b333-40bb-8a95-38a148b53e8b-config-data\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.086412 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7cfd7cb57f-82mpw" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.101985 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a04bc36-b333-40bb-8a95-38a148b53e8b-config-data-custom\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.104232 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwskh\" (UniqueName: \"kubernetes.io/projected/691aa928-1e2b-4e6c-a43b-29c523569e2c-kube-api-access-qwskh\") pod \"dnsmasq-dns-7df66dbc59-6p6js\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") " pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.105433 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-728zv\" (UniqueName: \"kubernetes.io/projected/2a04bc36-b333-40bb-8a95-38a148b53e8b-kube-api-access-728zv\") pod \"barbican-api-699cfdb8d4-skqb8\" (UID: \"2a04bc36-b333-40bb-8a95-38a148b53e8b\") " pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.139934 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.200073 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.567010 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78c4f66974-59mhf"] Jan 29 15:31:02 crc kubenswrapper[4753]: W0129 15:31:02.568726 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod710ca968_bd29_41e7_9101_11e445b4fc1b.slice/crio-c125f5fa4e3594b773d43a951e38c2a57879cbe6e60a63be8f0858669c63e12b WatchSource:0}: Error finding container c125f5fa4e3594b773d43a951e38c2a57879cbe6e60a63be8f0858669c63e12b: Status 404 returned error can't find the container with id c125f5fa4e3594b773d43a951e38c2a57879cbe6e60a63be8f0858669c63e12b Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.660211 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cfd7cb57f-82mpw"] Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.667577 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7df66dbc59-6p6js"] Jan 29 15:31:02 crc kubenswrapper[4753]: I0129 15:31:02.752085 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-699cfdb8d4-skqb8"] Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.414024 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-699cfdb8d4-skqb8" event={"ID":"2a04bc36-b333-40bb-8a95-38a148b53e8b","Type":"ContainerStarted","Data":"797af24abad24336cb3cd57df7f960fb6a14a9256377a120909f83b36949c205"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.414382 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-699cfdb8d4-skqb8" event={"ID":"2a04bc36-b333-40bb-8a95-38a148b53e8b","Type":"ContainerStarted","Data":"8559777d7ce99af0c24d0164997497a92d034cc1c2081325e8ffa8c3e2082ff8"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.414393 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-699cfdb8d4-skqb8" event={"ID":"2a04bc36-b333-40bb-8a95-38a148b53e8b","Type":"ContainerStarted","Data":"d5a81673802fd33d7019897aa5ce98bfa9ba94496976410d5ef0a8658f8dba15"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.414409 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.414421 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.417940 4753 generic.go:334] "Generic (PLEG): container finished" podID="691aa928-1e2b-4e6c-a43b-29c523569e2c" containerID="f57e6f11bcf20068011a3c67cd574f0bbc022cff6ebd880f5e8728018b318d65" exitCode=0 Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.418052 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" event={"ID":"691aa928-1e2b-4e6c-a43b-29c523569e2c","Type":"ContainerDied","Data":"f57e6f11bcf20068011a3c67cd574f0bbc022cff6ebd880f5e8728018b318d65"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.418204 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" event={"ID":"691aa928-1e2b-4e6c-a43b-29c523569e2c","Type":"ContainerStarted","Data":"bcb788ff80e1e29c5c5e0992c93d385654ad944f8fb31d7a5c66856c5857ff67"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.421556 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cfd7cb57f-82mpw" event={"ID":"1b997396-38f1-426e-a2d8-318808c53a6c","Type":"ContainerStarted","Data":"fc335aeae4e6a9092b660973da65d28e70e4b10331167d4a05f1a01e0c50bc35"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.421592 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cfd7cb57f-82mpw" event={"ID":"1b997396-38f1-426e-a2d8-318808c53a6c","Type":"ContainerStarted","Data":"719e187beea4ac88dc191326d15fad1547ee5688e92ff5d12f84cbc19f41ef2e"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.421637 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cfd7cb57f-82mpw" event={"ID":"1b997396-38f1-426e-a2d8-318808c53a6c","Type":"ContainerStarted","Data":"39aa0f0161a08b8a508b2450a3bd2c07a91f135cd793cfc2354c3259161be9e7"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.424379 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" event={"ID":"710ca968-bd29-41e7-9101-11e445b4fc1b","Type":"ContainerStarted","Data":"de901fb29df0bab547e9006ebef5cdb98214c0422393f61ece31a9e1b644ef11"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.424414 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" event={"ID":"710ca968-bd29-41e7-9101-11e445b4fc1b","Type":"ContainerStarted","Data":"5abef1a0cf441dbaaf9db43d7bd77dadd9cafdb62b09bf625220779b11885426"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.424424 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" event={"ID":"710ca968-bd29-41e7-9101-11e445b4fc1b","Type":"ContainerStarted","Data":"c125f5fa4e3594b773d43a951e38c2a57879cbe6e60a63be8f0858669c63e12b"} Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.448384 4753 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/barbican-api-699cfdb8d4-skqb8" podStartSLOduration=2.448359249 podStartE2EDuration="2.448359249s" podCreationTimestamp="2026-01-29 15:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:31:03.443316182 +0000 UTC m=+5298.138050584" watchObservedRunningTime="2026-01-29 15:31:03.448359249 +0000 UTC m=+5298.143093671" Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.472140 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7cfd7cb57f-82mpw" podStartSLOduration=2.47212627 podStartE2EDuration="2.47212627s" podCreationTimestamp="2026-01-29 15:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:31:03.468860681 +0000 UTC m=+5298.163595063" watchObservedRunningTime="2026-01-29 15:31:03.47212627 +0000 UTC m=+5298.166860652" Jan 29 15:31:03 crc kubenswrapper[4753]: I0129 15:31:03.494562 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78c4f66974-59mhf" podStartSLOduration=2.494543624 podStartE2EDuration="2.494543624s" podCreationTimestamp="2026-01-29 15:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:31:03.486866336 +0000 UTC m=+5298.181600718" watchObservedRunningTime="2026-01-29 15:31:03.494543624 +0000 UTC m=+5298.189278006" Jan 29 15:31:04 crc kubenswrapper[4753]: I0129 15:31:04.433516 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" event={"ID":"691aa928-1e2b-4e6c-a43b-29c523569e2c","Type":"ContainerStarted","Data":"3a33f4204986474dfbf4b9c7d2a77142f19b53e3d3c4b19769ac7d1b2f59eabf"} Jan 29 15:31:04 crc kubenswrapper[4753]: I0129 15:31:04.433981 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:04 crc kubenswrapper[4753]: I0129 15:31:04.454872 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" podStartSLOduration=3.454850132 podStartE2EDuration="3.454850132s" podCreationTimestamp="2026-01-29 15:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:31:04.448603854 +0000 UTC m=+5299.143338236" watchObservedRunningTime="2026-01-29 15:31:04.454850132 +0000 UTC m=+5299.149584514" Jan 29 15:31:06 crc kubenswrapper[4753]: I0129 15:31:06.156868 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:31:06 crc kubenswrapper[4753]: E0129 15:31:06.157476 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:31:07 crc kubenswrapper[4753]: I0129 15:31:07.132709 4753 scope.go:117] "RemoveContainer" containerID="0bab75a23c795ae9cb14470ba54f6cc6c33394ce0ca512bb1a47e3e31d29f186" Jan 29 15:31:08 crc 
kubenswrapper[4753]: I0129 15:31:08.622335 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:10 crc kubenswrapper[4753]: I0129 15:31:10.106606 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-699cfdb8d4-skqb8" Jan 29 15:31:12 crc kubenswrapper[4753]: I0129 15:31:12.142333 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:12 crc kubenswrapper[4753]: I0129 15:31:12.249230 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64dc58b59-5pmzk"] Jan 29 15:31:12 crc kubenswrapper[4753]: I0129 15:31:12.249514 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" podUID="a5ed2a79-bd49-4241-a902-633a212cb20a" containerName="dnsmasq-dns" containerID="cri-o://a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144" gracePeriod=10 Jan 29 15:31:12 crc kubenswrapper[4753]: I0129 15:31:12.945321 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.115784 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-config\") pod \"a5ed2a79-bd49-4241-a902-633a212cb20a\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.115829 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-nb\") pod \"a5ed2a79-bd49-4241-a902-633a212cb20a\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.115855 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skv8l\" (UniqueName: \"kubernetes.io/projected/a5ed2a79-bd49-4241-a902-633a212cb20a-kube-api-access-skv8l\") pod \"a5ed2a79-bd49-4241-a902-633a212cb20a\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.115985 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-sb\") pod \"a5ed2a79-bd49-4241-a902-633a212cb20a\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.116020 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-dns-svc\") pod \"a5ed2a79-bd49-4241-a902-633a212cb20a\" (UID: \"a5ed2a79-bd49-4241-a902-633a212cb20a\") " Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.124331 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ed2a79-bd49-4241-a902-633a212cb20a-kube-api-access-skv8l" (OuterVolumeSpecName: "kube-api-access-skv8l") pod "a5ed2a79-bd49-4241-a902-633a212cb20a" (UID: "a5ed2a79-bd49-4241-a902-633a212cb20a"). InnerVolumeSpecName "kube-api-access-skv8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.155762 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a5ed2a79-bd49-4241-a902-633a212cb20a" (UID: "a5ed2a79-bd49-4241-a902-633a212cb20a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.169463 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a5ed2a79-bd49-4241-a902-633a212cb20a" (UID: "a5ed2a79-bd49-4241-a902-633a212cb20a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.179415 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-config" (OuterVolumeSpecName: "config") pod "a5ed2a79-bd49-4241-a902-633a212cb20a" (UID: "a5ed2a79-bd49-4241-a902-633a212cb20a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.188035 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a5ed2a79-bd49-4241-a902-633a212cb20a" (UID: "a5ed2a79-bd49-4241-a902-633a212cb20a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.217922 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.217952 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.217961 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.217969 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5ed2a79-bd49-4241-a902-633a212cb20a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.217979 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skv8l\" (UniqueName: \"kubernetes.io/projected/a5ed2a79-bd49-4241-a902-633a212cb20a-kube-api-access-skv8l\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.520127 4753 generic.go:334] "Generic (PLEG): container finished" podID="a5ed2a79-bd49-4241-a902-633a212cb20a" containerID="a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144" exitCode=0 Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.520193 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" 
event={"ID":"a5ed2a79-bd49-4241-a902-633a212cb20a","Type":"ContainerDied","Data":"a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144"} Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.520232 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" event={"ID":"a5ed2a79-bd49-4241-a902-633a212cb20a","Type":"ContainerDied","Data":"e137a76c5d21ed02d00c79e6b367c3f432fc0186d1294ff8421857a925d07aa2"} Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.520251 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64dc58b59-5pmzk" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.520256 4753 scope.go:117] "RemoveContainer" containerID="a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.572591 4753 scope.go:117] "RemoveContainer" containerID="7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.572876 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64dc58b59-5pmzk"] Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.588722 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64dc58b59-5pmzk"] Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.630494 4753 scope.go:117] "RemoveContainer" containerID="a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144" Jan 29 15:31:13 crc kubenswrapper[4753]: E0129 15:31:13.631299 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144\": container with ID starting with a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144 not found: ID does not exist" containerID="a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.631338 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144"} err="failed to get container status \"a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144\": rpc error: code = NotFound desc = could not find container \"a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144\": container with ID starting with a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144 not found: ID does not exist" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.631364 4753 scope.go:117] "RemoveContainer" containerID="7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922" Jan 29 15:31:13 crc kubenswrapper[4753]: E0129 15:31:13.631817 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922\": container with ID starting with 7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922 not found: ID does not exist" containerID="7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922" Jan 29 15:31:13 crc kubenswrapper[4753]: I0129 15:31:13.631853 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922"} err="failed to get container status \"7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922\": rpc 
error: code = NotFound desc = could not find container \"7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922\": container with ID starting with 7e46e29e243fab4923ce3282c58673099fbecc59e741ca62bcd7a4debb6da922 not found: ID does not exist" Jan 29 15:31:14 crc kubenswrapper[4753]: I0129 15:31:14.166218 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ed2a79-bd49-4241-a902-633a212cb20a" path="/var/lib/kubelet/pods/a5ed2a79-bd49-4241-a902-633a212cb20a/volumes" Jan 29 15:31:17 crc kubenswrapper[4753]: I0129 15:31:17.149574 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:31:17 crc kubenswrapper[4753]: E0129 15:31:17.150226 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.399659 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qslf4"] Jan 29 15:31:22 crc kubenswrapper[4753]: E0129 15:31:22.400348 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ed2a79-bd49-4241-a902-633a212cb20a" containerName="init" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.400367 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ed2a79-bd49-4241-a902-633a212cb20a" containerName="init" Jan 29 15:31:22 crc kubenswrapper[4753]: E0129 15:31:22.400388 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ed2a79-bd49-4241-a902-633a212cb20a" containerName="dnsmasq-dns" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.400394 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ed2a79-bd49-4241-a902-633a212cb20a" containerName="dnsmasq-dns" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.400549 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ed2a79-bd49-4241-a902-633a212cb20a" containerName="dnsmasq-dns" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.401270 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.417284 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qslf4"] Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.423605 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9cac-account-create-update-fwgr5"] Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.425418 4753 util.go:30] "No sandbox for pod can be found. 
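The RemoveContainer retries above show a deliberate pattern: ContainerStatus answers with rpc code NotFound for an ID that is already gone, the kubelet logs the error, and then proceeds, because a missing container is the desired end state of a delete. A sketch of that idempotent handling using gRPC status codes (the statusErr plumbing stands in for a real CRI client call and is illustrative only):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIfPresent treats a NotFound status as "already removed" rather
// than a failure, mirroring the DeleteContainer flow in the entries above.
func removeIfPresent(id string, statusErr error) error {
	if statusErr != nil {
		if status.Code(statusErr) == codes.NotFound {
			fmt.Printf("container %.12s... already removed, skipping\n", id)
			return nil
		}
		return fmt.Errorf("ContainerStatus failed: %w", statusErr)
	}
	// would issue RemoveContainer against the runtime here
	return nil
}

func main() {
	err := removeIfPresent(
		"a69cf3083a883937156e5ff853f6c5595c8acbfb64420ebeaa41587dc8631144",
		status.Error(codes.NotFound, "could not find container"),
	)
	fmt.Println("err:", err) // err: <nil>
}
```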
Need to start a new one" pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.427576 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.454627 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9cac-account-create-update-fwgr5"] Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.578002 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swtt4\" (UniqueName: \"kubernetes.io/projected/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-kube-api-access-swtt4\") pod \"neutron-db-create-qslf4\" (UID: \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\") " pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.578050 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c5xw\" (UniqueName: \"kubernetes.io/projected/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-kube-api-access-2c5xw\") pod \"neutron-9cac-account-create-update-fwgr5\" (UID: \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\") " pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.578215 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-operator-scripts\") pod \"neutron-db-create-qslf4\" (UID: \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\") " pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.578241 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-operator-scripts\") pod \"neutron-9cac-account-create-update-fwgr5\" (UID: \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\") " pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.680085 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swtt4\" (UniqueName: \"kubernetes.io/projected/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-kube-api-access-swtt4\") pod \"neutron-db-create-qslf4\" (UID: \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\") " pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.680144 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c5xw\" (UniqueName: \"kubernetes.io/projected/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-kube-api-access-2c5xw\") pod \"neutron-9cac-account-create-update-fwgr5\" (UID: \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\") " pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.680221 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-operator-scripts\") pod \"neutron-db-create-qslf4\" (UID: \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\") " pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.680242 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-operator-scripts\") pod \"neutron-9cac-account-create-update-fwgr5\" (UID: \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\") " pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.681138 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-operator-scripts\") pod \"neutron-9cac-account-create-update-fwgr5\" (UID: \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\") " pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.681606 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-operator-scripts\") pod \"neutron-db-create-qslf4\" (UID: \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\") " pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.699519 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c5xw\" (UniqueName: \"kubernetes.io/projected/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-kube-api-access-2c5xw\") pod \"neutron-9cac-account-create-update-fwgr5\" (UID: \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\") " pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.699552 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swtt4\" (UniqueName: \"kubernetes.io/projected/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-kube-api-access-swtt4\") pod \"neutron-db-create-qslf4\" (UID: \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\") " pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.718742 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:22 crc kubenswrapper[4753]: I0129 15:31:22.739826 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:23 crc kubenswrapper[4753]: I0129 15:31:23.082924 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9cac-account-create-update-fwgr5"] Jan 29 15:31:23 crc kubenswrapper[4753]: W0129 15:31:23.086856 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8b3b498_38f4_46cd_b5a8_7fb0512ffb95.slice/crio-be3ea2ce9e243e39e0d1bfaa1372b1640160ce488501d67bbc72811e81477ba1 WatchSource:0}: Error finding container be3ea2ce9e243e39e0d1bfaa1372b1640160ce488501d67bbc72811e81477ba1: Status 404 returned error can't find the container with id be3ea2ce9e243e39e0d1bfaa1372b1640160ce488501d67bbc72811e81477ba1 Jan 29 15:31:23 crc kubenswrapper[4753]: I0129 15:31:23.203102 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qslf4"] Jan 29 15:31:23 crc kubenswrapper[4753]: I0129 15:31:23.609740 4753 generic.go:334] "Generic (PLEG): container finished" podID="e8b3b498-38f4-46cd-b5a8-7fb0512ffb95" containerID="3c45b582985d1e02bd7d28b086406d7fd8c82f48df6231daa33bf78cb067a865" exitCode=0 Jan 29 15:31:23 crc kubenswrapper[4753]: I0129 15:31:23.609810 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9cac-account-create-update-fwgr5" event={"ID":"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95","Type":"ContainerDied","Data":"3c45b582985d1e02bd7d28b086406d7fd8c82f48df6231daa33bf78cb067a865"} Jan 29 15:31:23 crc kubenswrapper[4753]: I0129 15:31:23.610135 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9cac-account-create-update-fwgr5" event={"ID":"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95","Type":"ContainerStarted","Data":"be3ea2ce9e243e39e0d1bfaa1372b1640160ce488501d67bbc72811e81477ba1"} Jan 29 15:31:23 crc kubenswrapper[4753]: I0129 15:31:23.613104 4753 generic.go:334] "Generic (PLEG): container finished" podID="b717bb47-ad09-46c8-8f5c-6d760dcdfc9b" containerID="55719bfd1035cecf78229922aae1343249d3fcc17d4b4201732714e31a0ae252" exitCode=0 Jan 29 15:31:23 crc kubenswrapper[4753]: I0129 15:31:23.613178 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qslf4" event={"ID":"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b","Type":"ContainerDied","Data":"55719bfd1035cecf78229922aae1343249d3fcc17d4b4201732714e31a0ae252"} Jan 29 15:31:23 crc kubenswrapper[4753]: I0129 15:31:23.613234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qslf4" event={"ID":"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b","Type":"ContainerStarted","Data":"6bc229ca35e01bf0a36e7da59bae0cb82551769cef68dfcf44d2988f1a4ed649"} Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.031191 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.039957 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.124921 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-operator-scripts\") pod \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\" (UID: \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\") " Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.125089 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swtt4\" (UniqueName: \"kubernetes.io/projected/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-kube-api-access-swtt4\") pod \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\" (UID: \"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b\") " Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.125136 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c5xw\" (UniqueName: \"kubernetes.io/projected/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-kube-api-access-2c5xw\") pod \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\" (UID: \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\") " Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.125229 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-operator-scripts\") pod \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\" (UID: \"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95\") " Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.125410 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b717bb47-ad09-46c8-8f5c-6d760dcdfc9b" (UID: "b717bb47-ad09-46c8-8f5c-6d760dcdfc9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.125937 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.125949 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8b3b498-38f4-46cd-b5a8-7fb0512ffb95" (UID: "e8b3b498-38f4-46cd-b5a8-7fb0512ffb95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.142974 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-kube-api-access-2c5xw" (OuterVolumeSpecName: "kube-api-access-2c5xw") pod "e8b3b498-38f4-46cd-b5a8-7fb0512ffb95" (UID: "e8b3b498-38f4-46cd-b5a8-7fb0512ffb95"). InnerVolumeSpecName "kube-api-access-2c5xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.143478 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-kube-api-access-swtt4" (OuterVolumeSpecName: "kube-api-access-swtt4") pod "b717bb47-ad09-46c8-8f5c-6d760dcdfc9b" (UID: "b717bb47-ad09-46c8-8f5c-6d760dcdfc9b"). InnerVolumeSpecName "kube-api-access-swtt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.227382 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c5xw\" (UniqueName: \"kubernetes.io/projected/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-kube-api-access-2c5xw\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.227429 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.227439 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swtt4\" (UniqueName: \"kubernetes.io/projected/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b-kube-api-access-swtt4\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.635392 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9cac-account-create-update-fwgr5" event={"ID":"e8b3b498-38f4-46cd-b5a8-7fb0512ffb95","Type":"ContainerDied","Data":"be3ea2ce9e243e39e0d1bfaa1372b1640160ce488501d67bbc72811e81477ba1"} Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.635708 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be3ea2ce9e243e39e0d1bfaa1372b1640160ce488501d67bbc72811e81477ba1" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.635420 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9cac-account-create-update-fwgr5" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.637359 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qslf4" event={"ID":"b717bb47-ad09-46c8-8f5c-6d760dcdfc9b","Type":"ContainerDied","Data":"6bc229ca35e01bf0a36e7da59bae0cb82551769cef68dfcf44d2988f1a4ed649"} Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.637465 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bc229ca35e01bf0a36e7da59bae0cb82551769cef68dfcf44d2988f1a4ed649" Jan 29 15:31:25 crc kubenswrapper[4753]: I0129 15:31:25.637386 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qslf4" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.740186 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8j5qx"] Jan 29 15:31:27 crc kubenswrapper[4753]: E0129 15:31:27.740865 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b3b498-38f4-46cd-b5a8-7fb0512ffb95" containerName="mariadb-account-create-update" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.740878 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b3b498-38f4-46cd-b5a8-7fb0512ffb95" containerName="mariadb-account-create-update" Jan 29 15:31:27 crc kubenswrapper[4753]: E0129 15:31:27.740898 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b717bb47-ad09-46c8-8f5c-6d760dcdfc9b" containerName="mariadb-database-create" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.740905 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b717bb47-ad09-46c8-8f5c-6d760dcdfc9b" containerName="mariadb-database-create" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.741063 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b3b498-38f4-46cd-b5a8-7fb0512ffb95" containerName="mariadb-account-create-update" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.741084 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b717bb47-ad09-46c8-8f5c-6d760dcdfc9b" containerName="mariadb-database-create" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.741770 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.744965 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.745373 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-22jzv" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.746518 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.748953 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8j5qx"] Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.870284 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgl4\" (UniqueName: \"kubernetes.io/projected/df72e2f5-4140-4320-9057-573e0d202332-kube-api-access-vjgl4\") pod \"neutron-db-sync-8j5qx\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.870416 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-combined-ca-bundle\") pod \"neutron-db-sync-8j5qx\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.870508 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-config\") pod \"neutron-db-sync-8j5qx\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 
15:31:27.971863 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-combined-ca-bundle\") pod \"neutron-db-sync-8j5qx\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.971970 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-config\") pod \"neutron-db-sync-8j5qx\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.972043 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgl4\" (UniqueName: \"kubernetes.io/projected/df72e2f5-4140-4320-9057-573e0d202332-kube-api-access-vjgl4\") pod \"neutron-db-sync-8j5qx\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.977738 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-combined-ca-bundle\") pod \"neutron-db-sync-8j5qx\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.978561 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-config\") pod \"neutron-db-sync-8j5qx\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:27 crc kubenswrapper[4753]: I0129 15:31:27.988109 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjgl4\" (UniqueName: \"kubernetes.io/projected/df72e2f5-4140-4320-9057-573e0d202332-kube-api-access-vjgl4\") pod \"neutron-db-sync-8j5qx\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:28 crc kubenswrapper[4753]: I0129 15:31:28.069243 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:28 crc kubenswrapper[4753]: I0129 15:31:28.149862 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:31:28 crc kubenswrapper[4753]: E0129 15:31:28.150130 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:31:28 crc kubenswrapper[4753]: I0129 15:31:28.572440 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8j5qx"] Jan 29 15:31:28 crc kubenswrapper[4753]: W0129 15:31:28.575807 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf72e2f5_4140_4320_9057_573e0d202332.slice/crio-d5843dfcaea103749ace5037994454b8a9f466897abe206a6d7559b30be5e814 WatchSource:0}: Error finding container d5843dfcaea103749ace5037994454b8a9f466897abe206a6d7559b30be5e814: Status 404 returned error can't find the container with id d5843dfcaea103749ace5037994454b8a9f466897abe206a6d7559b30be5e814 Jan 29 15:31:28 crc kubenswrapper[4753]: I0129 15:31:28.658745 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8j5qx" event={"ID":"df72e2f5-4140-4320-9057-573e0d202332","Type":"ContainerStarted","Data":"d5843dfcaea103749ace5037994454b8a9f466897abe206a6d7559b30be5e814"} Jan 29 15:31:29 crc kubenswrapper[4753]: I0129 15:31:29.667190 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8j5qx" event={"ID":"df72e2f5-4140-4320-9057-573e0d202332","Type":"ContainerStarted","Data":"592d94274dcce8c03f05a8a7563552395e5dc3d1e4c1531a077c14acb7ab27cc"} Jan 29 15:31:33 crc kubenswrapper[4753]: I0129 15:31:33.698381 4753 generic.go:334] "Generic (PLEG): container finished" podID="df72e2f5-4140-4320-9057-573e0d202332" containerID="592d94274dcce8c03f05a8a7563552395e5dc3d1e4c1531a077c14acb7ab27cc" exitCode=0 Jan 29 15:31:33 crc kubenswrapper[4753]: I0129 15:31:33.698460 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8j5qx" event={"ID":"df72e2f5-4140-4320-9057-573e0d202332","Type":"ContainerDied","Data":"592d94274dcce8c03f05a8a7563552395e5dc3d1e4c1531a077c14acb7ab27cc"} Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.085872 4753 util.go:48] "No ready sandbox for pod can be found. 
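The recurring "back-off 5m0s restarting failed container" errors for machine-config-daemon reflect kubelet's capped exponential restart backoff: the delay doubles on each crash until it saturates at the 5m0s quoted in the message. A sketch of that schedule (the 10s seed is kubelet's documented default and is assumed here):

```go
package main

import (
	"fmt"
	"time"
)

// restartDelay models kubelet's crash-loop backoff: the wait doubles per
// restart and saturates at 5m0s, the cap quoted in the errors above.
func restartDelay(restarts int) time.Duration {
	const (
		initial = 10 * time.Second // assumed kubelet default seed
		max     = 5 * time.Minute
	)
	d := initial
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= max {
			return max
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %s\n", r, restartDelay(r))
	}
	// 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s
}
```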
Need to start a new one" pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.193197 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjgl4\" (UniqueName: \"kubernetes.io/projected/df72e2f5-4140-4320-9057-573e0d202332-kube-api-access-vjgl4\") pod \"df72e2f5-4140-4320-9057-573e0d202332\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.193255 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-combined-ca-bundle\") pod \"df72e2f5-4140-4320-9057-573e0d202332\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.193377 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-config\") pod \"df72e2f5-4140-4320-9057-573e0d202332\" (UID: \"df72e2f5-4140-4320-9057-573e0d202332\") " Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.214435 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df72e2f5-4140-4320-9057-573e0d202332-kube-api-access-vjgl4" (OuterVolumeSpecName: "kube-api-access-vjgl4") pod "df72e2f5-4140-4320-9057-573e0d202332" (UID: "df72e2f5-4140-4320-9057-573e0d202332"). InnerVolumeSpecName "kube-api-access-vjgl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.223338 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-config" (OuterVolumeSpecName: "config") pod "df72e2f5-4140-4320-9057-573e0d202332" (UID: "df72e2f5-4140-4320-9057-573e0d202332"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.228047 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df72e2f5-4140-4320-9057-573e0d202332" (UID: "df72e2f5-4140-4320-9057-573e0d202332"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.296062 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.296104 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjgl4\" (UniqueName: \"kubernetes.io/projected/df72e2f5-4140-4320-9057-573e0d202332-kube-api-access-vjgl4\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.296120 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df72e2f5-4140-4320-9057-573e0d202332-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.719171 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8j5qx" event={"ID":"df72e2f5-4140-4320-9057-573e0d202332","Type":"ContainerDied","Data":"d5843dfcaea103749ace5037994454b8a9f466897abe206a6d7559b30be5e814"} Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.719456 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5843dfcaea103749ace5037994454b8a9f466897abe206a6d7559b30be5e814" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.719324 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8j5qx" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.957412 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcb785c5-whf45"] Jan 29 15:31:35 crc kubenswrapper[4753]: E0129 15:31:35.958074 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df72e2f5-4140-4320-9057-573e0d202332" containerName="neutron-db-sync" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.958092 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="df72e2f5-4140-4320-9057-573e0d202332" containerName="neutron-db-sync" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.958295 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="df72e2f5-4140-4320-9057-573e0d202332" containerName="neutron-db-sync" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.959175 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:35 crc kubenswrapper[4753]: I0129 15:31:35.995218 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcb785c5-whf45"] Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.031472 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c88956d6f-wsn9g"] Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.036932 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.039957 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-22jzv" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.040090 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.040892 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.044706 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c88956d6f-wsn9g"] Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.118415 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj9vq\" (UniqueName: \"kubernetes.io/projected/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-kube-api-access-gj9vq\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.118514 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgpk6\" (UniqueName: \"kubernetes.io/projected/b32c6f3c-139b-42b7-8f6c-5a93c188c343-kube-api-access-kgpk6\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.118545 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-combined-ca-bundle\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.118577 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-dns-svc\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.118602 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.118655 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.118677 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-httpd-config\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " 
pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.118695 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-config\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.118724 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-config\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.219678 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj9vq\" (UniqueName: \"kubernetes.io/projected/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-kube-api-access-gj9vq\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.219772 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgpk6\" (UniqueName: \"kubernetes.io/projected/b32c6f3c-139b-42b7-8f6c-5a93c188c343-kube-api-access-kgpk6\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.219802 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-combined-ca-bundle\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.219830 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-dns-svc\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.219853 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.219876 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.219899 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-httpd-config\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc 
kubenswrapper[4753]: I0129 15:31:36.219921 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-config\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.219939 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-config\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.220965 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-config\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.220966 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.221182 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-dns-svc\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.221715 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.226415 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-combined-ca-bundle\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.227020 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-httpd-config\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.228510 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-config\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.235880 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgpk6\" (UniqueName: 
\"kubernetes.io/projected/b32c6f3c-139b-42b7-8f6c-5a93c188c343-kube-api-access-kgpk6\") pod \"dnsmasq-dns-76fcb785c5-whf45\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.236919 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj9vq\" (UniqueName: \"kubernetes.io/projected/56cda1a5-a73b-4da0-b9e1-0d95f12387c8-kube-api-access-gj9vq\") pod \"neutron-7c88956d6f-wsn9g\" (UID: \"56cda1a5-a73b-4da0-b9e1-0d95f12387c8\") " pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.332936 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.367365 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:36 crc kubenswrapper[4753]: I0129 15:31:36.848414 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcb785c5-whf45"] Jan 29 15:31:37 crc kubenswrapper[4753]: I0129 15:31:37.029112 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c88956d6f-wsn9g"] Jan 29 15:31:37 crc kubenswrapper[4753]: W0129 15:31:37.035444 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56cda1a5_a73b_4da0_b9e1_0d95f12387c8.slice/crio-69537ff847046dab2e275caf7f2f6ff7701b019a17197a0d8fa9baa9fabd02af WatchSource:0}: Error finding container 69537ff847046dab2e275caf7f2f6ff7701b019a17197a0d8fa9baa9fabd02af: Status 404 returned error can't find the container with id 69537ff847046dab2e275caf7f2f6ff7701b019a17197a0d8fa9baa9fabd02af Jan 29 15:31:37 crc kubenswrapper[4753]: I0129 15:31:37.748841 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c88956d6f-wsn9g" event={"ID":"56cda1a5-a73b-4da0-b9e1-0d95f12387c8","Type":"ContainerStarted","Data":"89af8283a0fbab760f56c4124f09ceaa4846a3659dd352dfec3ee8ec3e7afac1"} Jan 29 15:31:37 crc kubenswrapper[4753]: I0129 15:31:37.750410 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:31:37 crc kubenswrapper[4753]: I0129 15:31:37.750450 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c88956d6f-wsn9g" event={"ID":"56cda1a5-a73b-4da0-b9e1-0d95f12387c8","Type":"ContainerStarted","Data":"88621dc35bb2ce2246f110b5d57aaebf3f1caa38855ad2e783d0a7c40e79f796"} Jan 29 15:31:37 crc kubenswrapper[4753]: I0129 15:31:37.750469 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c88956d6f-wsn9g" event={"ID":"56cda1a5-a73b-4da0-b9e1-0d95f12387c8","Type":"ContainerStarted","Data":"69537ff847046dab2e275caf7f2f6ff7701b019a17197a0d8fa9baa9fabd02af"} Jan 29 15:31:37 crc kubenswrapper[4753]: I0129 15:31:37.753557 4753 generic.go:334] "Generic (PLEG): container finished" podID="b32c6f3c-139b-42b7-8f6c-5a93c188c343" containerID="70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320" exitCode=0 Jan 29 15:31:37 crc kubenswrapper[4753]: I0129 15:31:37.753738 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" event={"ID":"b32c6f3c-139b-42b7-8f6c-5a93c188c343","Type":"ContainerDied","Data":"70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320"} Jan 29 15:31:37 crc kubenswrapper[4753]: 
I0129 15:31:37.753779 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" event={"ID":"b32c6f3c-139b-42b7-8f6c-5a93c188c343","Type":"ContainerStarted","Data":"51eeadff935a19b5197ff40980293dc303e5c92b231b0c4de230df0335ff721c"} Jan 29 15:31:37 crc kubenswrapper[4753]: I0129 15:31:37.786500 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c88956d6f-wsn9g" podStartSLOduration=1.786481748 podStartE2EDuration="1.786481748s" podCreationTimestamp="2026-01-29 15:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:31:37.77764927 +0000 UTC m=+5332.472383662" watchObservedRunningTime="2026-01-29 15:31:37.786481748 +0000 UTC m=+5332.481216130" Jan 29 15:31:38 crc kubenswrapper[4753]: I0129 15:31:38.772239 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" event={"ID":"b32c6f3c-139b-42b7-8f6c-5a93c188c343","Type":"ContainerStarted","Data":"fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe"} Jan 29 15:31:38 crc kubenswrapper[4753]: I0129 15:31:38.772613 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:38 crc kubenswrapper[4753]: I0129 15:31:38.799469 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" podStartSLOduration=3.799450957 podStartE2EDuration="3.799450957s" podCreationTimestamp="2026-01-29 15:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:31:38.791861542 +0000 UTC m=+5333.486595954" watchObservedRunningTime="2026-01-29 15:31:38.799450957 +0000 UTC m=+5333.494185359" Jan 29 15:31:42 crc kubenswrapper[4753]: I0129 15:31:42.150795 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:31:42 crc kubenswrapper[4753]: E0129 15:31:42.151335 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:31:46 crc kubenswrapper[4753]: I0129 15:31:46.334331 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:31:46 crc kubenswrapper[4753]: I0129 15:31:46.394841 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7df66dbc59-6p6js"] Jan 29 15:31:46 crc kubenswrapper[4753]: I0129 15:31:46.395069 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" podUID="691aa928-1e2b-4e6c-a43b-29c523569e2c" containerName="dnsmasq-dns" containerID="cri-o://3a33f4204986474dfbf4b9c7d2a77142f19b53e3d3c4b19769ac7d1b2f59eabf" gracePeriod=10 Jan 29 15:31:46 crc kubenswrapper[4753]: I0129 15:31:46.840853 4753 generic.go:334] "Generic (PLEG): container finished" podID="691aa928-1e2b-4e6c-a43b-29c523569e2c" containerID="3a33f4204986474dfbf4b9c7d2a77142f19b53e3d3c4b19769ac7d1b2f59eabf" exitCode=0 Jan 29 15:31:46 crc 
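The "Killing container with a grace period ... gracePeriod=10" entries correspond to API deletes that carry an explicit grace period, which kubelet forwards to the runtime when stopping the superseded dnsmasq-dns pod. A client-go sketch issuing such a delete (kubeconfig path, context, and pod name are placeholders drawn from this log):

```go
package main

import (
	"context"
	"log"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// GracePeriodSeconds becomes the gracePeriod=10 seen in the kill entries.
	grace := int64(10)
	err = cs.CoreV1().Pods("openstack").Delete(context.TODO(),
		"dnsmasq-dns-7df66dbc59-6p6js",
		metav1.DeleteOptions{GracePeriodSeconds: &grace})
	if err != nil {
		log.Fatal(err)
	}
}
```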
Jan 29 15:31:46 crc kubenswrapper[4753]: I0129 15:31:46.841115 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" event={"ID":"691aa928-1e2b-4e6c-a43b-29c523569e2c","Type":"ContainerDied","Data":"3a33f4204986474dfbf4b9c7d2a77142f19b53e3d3c4b19769ac7d1b2f59eabf"}
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.005105 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js"
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.125112 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-config\") pod \"691aa928-1e2b-4e6c-a43b-29c523569e2c\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") "
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.125210 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-sb\") pod \"691aa928-1e2b-4e6c-a43b-29c523569e2c\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") "
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.125275 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwskh\" (UniqueName: \"kubernetes.io/projected/691aa928-1e2b-4e6c-a43b-29c523569e2c-kube-api-access-qwskh\") pod \"691aa928-1e2b-4e6c-a43b-29c523569e2c\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") "
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.125311 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-dns-svc\") pod \"691aa928-1e2b-4e6c-a43b-29c523569e2c\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") "
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.125342 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-nb\") pod \"691aa928-1e2b-4e6c-a43b-29c523569e2c\" (UID: \"691aa928-1e2b-4e6c-a43b-29c523569e2c\") "
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.133453 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/691aa928-1e2b-4e6c-a43b-29c523569e2c-kube-api-access-qwskh" (OuterVolumeSpecName: "kube-api-access-qwskh") pod "691aa928-1e2b-4e6c-a43b-29c523569e2c" (UID: "691aa928-1e2b-4e6c-a43b-29c523569e2c"). InnerVolumeSpecName "kube-api-access-qwskh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.169885 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "691aa928-1e2b-4e6c-a43b-29c523569e2c" (UID: "691aa928-1e2b-4e6c-a43b-29c523569e2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.175892 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "691aa928-1e2b-4e6c-a43b-29c523569e2c" (UID: "691aa928-1e2b-4e6c-a43b-29c523569e2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.189663 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "691aa928-1e2b-4e6c-a43b-29c523569e2c" (UID: "691aa928-1e2b-4e6c-a43b-29c523569e2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.199853 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-config" (OuterVolumeSpecName: "config") pod "691aa928-1e2b-4e6c-a43b-29c523569e2c" (UID: "691aa928-1e2b-4e6c-a43b-29c523569e2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.228136 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwskh\" (UniqueName: \"kubernetes.io/projected/691aa928-1e2b-4e6c-a43b-29c523569e2c-kube-api-access-qwskh\") on node \"crc\" DevicePath \"\""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.228185 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.228194 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.228203 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-config\") on node \"crc\" DevicePath \"\""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.228213 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/691aa928-1e2b-4e6c-a43b-29c523569e2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.850857 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" event={"ID":"691aa928-1e2b-4e6c-a43b-29c523569e2c","Type":"ContainerDied","Data":"bcb788ff80e1e29c5c5e0992c93d385654ad944f8fb31d7a5c66856c5857ff67"}
Need to start a new one" pod="openstack/dnsmasq-dns-7df66dbc59-6p6js" Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.850919 4753 scope.go:117] "RemoveContainer" containerID="3a33f4204986474dfbf4b9c7d2a77142f19b53e3d3c4b19769ac7d1b2f59eabf" Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.877458 4753 scope.go:117] "RemoveContainer" containerID="f57e6f11bcf20068011a3c67cd574f0bbc022cff6ebd880f5e8728018b318d65" Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.882923 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7df66dbc59-6p6js"] Jan 29 15:31:47 crc kubenswrapper[4753]: I0129 15:31:47.895443 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7df66dbc59-6p6js"] Jan 29 15:31:48 crc kubenswrapper[4753]: I0129 15:31:48.178491 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="691aa928-1e2b-4e6c-a43b-29c523569e2c" path="/var/lib/kubelet/pods/691aa928-1e2b-4e6c-a43b-29c523569e2c/volumes" Jan 29 15:31:57 crc kubenswrapper[4753]: I0129 15:31:57.149913 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:31:57 crc kubenswrapper[4753]: I0129 15:31:57.976955 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"20e283980ad77b065d0dfa0d4018e594dc6a0c2625911542352b6567ce9e5f09"} Jan 29 15:32:06 crc kubenswrapper[4753]: I0129 15:32:06.386947 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7c88956d6f-wsn9g" Jan 29 15:32:12 crc kubenswrapper[4753]: I0129 15:32:12.916245 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lgmwt"] Jan 29 15:32:12 crc kubenswrapper[4753]: E0129 15:32:12.917616 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691aa928-1e2b-4e6c-a43b-29c523569e2c" containerName="init" Jan 29 15:32:12 crc kubenswrapper[4753]: I0129 15:32:12.917634 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="691aa928-1e2b-4e6c-a43b-29c523569e2c" containerName="init" Jan 29 15:32:12 crc kubenswrapper[4753]: E0129 15:32:12.917688 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691aa928-1e2b-4e6c-a43b-29c523569e2c" containerName="dnsmasq-dns" Jan 29 15:32:12 crc kubenswrapper[4753]: I0129 15:32:12.917697 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="691aa928-1e2b-4e6c-a43b-29c523569e2c" containerName="dnsmasq-dns" Jan 29 15:32:12 crc kubenswrapper[4753]: I0129 15:32:12.918001 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="691aa928-1e2b-4e6c-a43b-29c523569e2c" containerName="dnsmasq-dns" Jan 29 15:32:12 crc kubenswrapper[4753]: I0129 15:32:12.918912 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:12 crc kubenswrapper[4753]: I0129 15:32:12.940614 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lgmwt"] Jan 29 15:32:12 crc kubenswrapper[4753]: I0129 15:32:12.985217 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f16af97b-955b-4903-b680-f7250f57874f-operator-scripts\") pod \"glance-db-create-lgmwt\" (UID: \"f16af97b-955b-4903-b680-f7250f57874f\") " pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:12 crc kubenswrapper[4753]: I0129 15:32:12.985312 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td8pz\" (UniqueName: \"kubernetes.io/projected/f16af97b-955b-4903-b680-f7250f57874f-kube-api-access-td8pz\") pod \"glance-db-create-lgmwt\" (UID: \"f16af97b-955b-4903-b680-f7250f57874f\") " pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:12 crc kubenswrapper[4753]: I0129 15:32:12.997464 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bc3f-account-create-update-94pk6"] Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.001849 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.012140 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bc3f-account-create-update-94pk6"] Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.017598 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.086292 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515e5952-854e-4cbe-9d9b-e6b27d558e69-operator-scripts\") pod \"glance-bc3f-account-create-update-94pk6\" (UID: \"515e5952-854e-4cbe-9d9b-e6b27d558e69\") " pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.086386 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgsrr\" (UniqueName: \"kubernetes.io/projected/515e5952-854e-4cbe-9d9b-e6b27d558e69-kube-api-access-bgsrr\") pod \"glance-bc3f-account-create-update-94pk6\" (UID: \"515e5952-854e-4cbe-9d9b-e6b27d558e69\") " pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.086503 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f16af97b-955b-4903-b680-f7250f57874f-operator-scripts\") pod \"glance-db-create-lgmwt\" (UID: \"f16af97b-955b-4903-b680-f7250f57874f\") " pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.086660 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td8pz\" (UniqueName: \"kubernetes.io/projected/f16af97b-955b-4903-b680-f7250f57874f-kube-api-access-td8pz\") pod \"glance-db-create-lgmwt\" (UID: \"f16af97b-955b-4903-b680-f7250f57874f\") " pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.087202 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f16af97b-955b-4903-b680-f7250f57874f-operator-scripts\") pod \"glance-db-create-lgmwt\" (UID: \"f16af97b-955b-4903-b680-f7250f57874f\") " pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.121750 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td8pz\" (UniqueName: \"kubernetes.io/projected/f16af97b-955b-4903-b680-f7250f57874f-kube-api-access-td8pz\") pod \"glance-db-create-lgmwt\" (UID: \"f16af97b-955b-4903-b680-f7250f57874f\") " pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.188887 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515e5952-854e-4cbe-9d9b-e6b27d558e69-operator-scripts\") pod \"glance-bc3f-account-create-update-94pk6\" (UID: \"515e5952-854e-4cbe-9d9b-e6b27d558e69\") " pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.188991 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgsrr\" (UniqueName: \"kubernetes.io/projected/515e5952-854e-4cbe-9d9b-e6b27d558e69-kube-api-access-bgsrr\") pod \"glance-bc3f-account-create-update-94pk6\" (UID: \"515e5952-854e-4cbe-9d9b-e6b27d558e69\") " pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.189609 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515e5952-854e-4cbe-9d9b-e6b27d558e69-operator-scripts\") pod \"glance-bc3f-account-create-update-94pk6\" (UID: \"515e5952-854e-4cbe-9d9b-e6b27d558e69\") " pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.206938 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgsrr\" (UniqueName: \"kubernetes.io/projected/515e5952-854e-4cbe-9d9b-e6b27d558e69-kube-api-access-bgsrr\") pod \"glance-bc3f-account-create-update-94pk6\" (UID: \"515e5952-854e-4cbe-9d9b-e6b27d558e69\") " pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.255269 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.332608 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.715025 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lgmwt"] Jan 29 15:32:13 crc kubenswrapper[4753]: I0129 15:32:13.871906 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bc3f-account-create-update-94pk6"] Jan 29 15:32:13 crc kubenswrapper[4753]: W0129 15:32:13.872575 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515e5952_854e_4cbe_9d9b_e6b27d558e69.slice/crio-5b2a4bdd89e75729cb7644323169e2844f9d67e593205081f1588799aa94e824 WatchSource:0}: Error finding container 5b2a4bdd89e75729cb7644323169e2844f9d67e593205081f1588799aa94e824: Status 404 returned error can't find the container with id 5b2a4bdd89e75729cb7644323169e2844f9d67e593205081f1588799aa94e824 Jan 29 15:32:14 crc kubenswrapper[4753]: I0129 15:32:14.112010 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc3f-account-create-update-94pk6" event={"ID":"515e5952-854e-4cbe-9d9b-e6b27d558e69","Type":"ContainerStarted","Data":"d829463f61eaf3c4db63ee69993e3899aad905441208fee8b6d0864b72523a57"} Jan 29 15:32:14 crc kubenswrapper[4753]: I0129 15:32:14.112071 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc3f-account-create-update-94pk6" event={"ID":"515e5952-854e-4cbe-9d9b-e6b27d558e69","Type":"ContainerStarted","Data":"5b2a4bdd89e75729cb7644323169e2844f9d67e593205081f1588799aa94e824"} Jan 29 15:32:14 crc kubenswrapper[4753]: I0129 15:32:14.115709 4753 generic.go:334] "Generic (PLEG): container finished" podID="f16af97b-955b-4903-b680-f7250f57874f" containerID="f596a6ac800a4d5f38e9ac5b72f7c6590e212839d6b81773272a7724d90c3c65" exitCode=0 Jan 29 15:32:14 crc kubenswrapper[4753]: I0129 15:32:14.115795 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lgmwt" event={"ID":"f16af97b-955b-4903-b680-f7250f57874f","Type":"ContainerDied","Data":"f596a6ac800a4d5f38e9ac5b72f7c6590e212839d6b81773272a7724d90c3c65"} Jan 29 15:32:14 crc kubenswrapper[4753]: I0129 15:32:14.116005 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lgmwt" event={"ID":"f16af97b-955b-4903-b680-f7250f57874f","Type":"ContainerStarted","Data":"75ac2ae13a84e86f2cd4c396f1e2d531ec9fa8470df491899dfaa2d15858ca30"} Jan 29 15:32:14 crc kubenswrapper[4753]: I0129 15:32:14.136687 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-bc3f-account-create-update-94pk6" podStartSLOduration=2.136666162 podStartE2EDuration="2.136666162s" podCreationTimestamp="2026-01-29 15:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:14.128286996 +0000 UTC m=+5368.823021408" watchObservedRunningTime="2026-01-29 15:32:14.136666162 +0000 UTC m=+5368.831400534" Jan 29 15:32:15 crc kubenswrapper[4753]: I0129 15:32:15.127120 4753 generic.go:334] "Generic (PLEG): container finished" podID="515e5952-854e-4cbe-9d9b-e6b27d558e69" containerID="d829463f61eaf3c4db63ee69993e3899aad905441208fee8b6d0864b72523a57" exitCode=0 Jan 29 15:32:15 crc kubenswrapper[4753]: I0129 15:32:15.127209 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc3f-account-create-update-94pk6" 
event={"ID":"515e5952-854e-4cbe-9d9b-e6b27d558e69","Type":"ContainerDied","Data":"d829463f61eaf3c4db63ee69993e3899aad905441208fee8b6d0864b72523a57"} Jan 29 15:32:15 crc kubenswrapper[4753]: I0129 15:32:15.410638 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:15 crc kubenswrapper[4753]: I0129 15:32:15.563815 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f16af97b-955b-4903-b680-f7250f57874f-operator-scripts\") pod \"f16af97b-955b-4903-b680-f7250f57874f\" (UID: \"f16af97b-955b-4903-b680-f7250f57874f\") " Jan 29 15:32:15 crc kubenswrapper[4753]: I0129 15:32:15.563930 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td8pz\" (UniqueName: \"kubernetes.io/projected/f16af97b-955b-4903-b680-f7250f57874f-kube-api-access-td8pz\") pod \"f16af97b-955b-4903-b680-f7250f57874f\" (UID: \"f16af97b-955b-4903-b680-f7250f57874f\") " Jan 29 15:32:15 crc kubenswrapper[4753]: I0129 15:32:15.570842 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16af97b-955b-4903-b680-f7250f57874f-kube-api-access-td8pz" (OuterVolumeSpecName: "kube-api-access-td8pz") pod "f16af97b-955b-4903-b680-f7250f57874f" (UID: "f16af97b-955b-4903-b680-f7250f57874f"). InnerVolumeSpecName "kube-api-access-td8pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:15 crc kubenswrapper[4753]: I0129 15:32:15.571299 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f16af97b-955b-4903-b680-f7250f57874f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f16af97b-955b-4903-b680-f7250f57874f" (UID: "f16af97b-955b-4903-b680-f7250f57874f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:32:15 crc kubenswrapper[4753]: I0129 15:32:15.665707 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td8pz\" (UniqueName: \"kubernetes.io/projected/f16af97b-955b-4903-b680-f7250f57874f-kube-api-access-td8pz\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:15 crc kubenswrapper[4753]: I0129 15:32:15.665762 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f16af97b-955b-4903-b680-f7250f57874f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.139491 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lgmwt" Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.140264 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lgmwt" event={"ID":"f16af97b-955b-4903-b680-f7250f57874f","Type":"ContainerDied","Data":"75ac2ae13a84e86f2cd4c396f1e2d531ec9fa8470df491899dfaa2d15858ca30"} Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.140295 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75ac2ae13a84e86f2cd4c396f1e2d531ec9fa8470df491899dfaa2d15858ca30" Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.461381 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.581743 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgsrr\" (UniqueName: \"kubernetes.io/projected/515e5952-854e-4cbe-9d9b-e6b27d558e69-kube-api-access-bgsrr\") pod \"515e5952-854e-4cbe-9d9b-e6b27d558e69\" (UID: \"515e5952-854e-4cbe-9d9b-e6b27d558e69\") " Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.581978 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515e5952-854e-4cbe-9d9b-e6b27d558e69-operator-scripts\") pod \"515e5952-854e-4cbe-9d9b-e6b27d558e69\" (UID: \"515e5952-854e-4cbe-9d9b-e6b27d558e69\") " Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.582532 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/515e5952-854e-4cbe-9d9b-e6b27d558e69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "515e5952-854e-4cbe-9d9b-e6b27d558e69" (UID: "515e5952-854e-4cbe-9d9b-e6b27d558e69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.582865 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/515e5952-854e-4cbe-9d9b-e6b27d558e69-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.586460 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515e5952-854e-4cbe-9d9b-e6b27d558e69-kube-api-access-bgsrr" (OuterVolumeSpecName: "kube-api-access-bgsrr") pod "515e5952-854e-4cbe-9d9b-e6b27d558e69" (UID: "515e5952-854e-4cbe-9d9b-e6b27d558e69"). InnerVolumeSpecName "kube-api-access-bgsrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:16 crc kubenswrapper[4753]: I0129 15:32:16.684736 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgsrr\" (UniqueName: \"kubernetes.io/projected/515e5952-854e-4cbe-9d9b-e6b27d558e69-kube-api-access-bgsrr\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:17 crc kubenswrapper[4753]: I0129 15:32:17.168604 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc3f-account-create-update-94pk6" event={"ID":"515e5952-854e-4cbe-9d9b-e6b27d558e69","Type":"ContainerDied","Data":"5b2a4bdd89e75729cb7644323169e2844f9d67e593205081f1588799aa94e824"} Jan 29 15:32:17 crc kubenswrapper[4753]: I0129 15:32:17.169325 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b2a4bdd89e75729cb7644323169e2844f9d67e593205081f1588799aa94e824" Jan 29 15:32:17 crc kubenswrapper[4753]: I0129 15:32:17.170930 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bc3f-account-create-update-94pk6" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.163347 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hw54d"] Jan 29 15:32:18 crc kubenswrapper[4753]: E0129 15:32:18.163838 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f16af97b-955b-4903-b680-f7250f57874f" containerName="mariadb-database-create" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.163863 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16af97b-955b-4903-b680-f7250f57874f" containerName="mariadb-database-create" Jan 29 15:32:18 crc kubenswrapper[4753]: E0129 15:32:18.163892 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515e5952-854e-4cbe-9d9b-e6b27d558e69" containerName="mariadb-account-create-update" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.163904 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="515e5952-854e-4cbe-9d9b-e6b27d558e69" containerName="mariadb-account-create-update" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.164189 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="515e5952-854e-4cbe-9d9b-e6b27d558e69" containerName="mariadb-account-create-update" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.164226 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f16af97b-955b-4903-b680-f7250f57874f" containerName="mariadb-database-create" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.164939 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.170785 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rltdb" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.170827 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.177615 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hw54d"] Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.314069 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-combined-ca-bundle\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.314208 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-config-data\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.314295 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-db-sync-config-data\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.314318 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8g2n\" (UniqueName: 
\"kubernetes.io/projected/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-kube-api-access-n8g2n\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.415930 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-config-data\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.416040 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-db-sync-config-data\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.416073 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8g2n\" (UniqueName: \"kubernetes.io/projected/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-kube-api-access-n8g2n\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.416110 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-combined-ca-bundle\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.421142 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-combined-ca-bundle\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.430399 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-config-data\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.433107 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8g2n\" (UniqueName: \"kubernetes.io/projected/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-kube-api-access-n8g2n\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.433617 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-db-sync-config-data\") pod \"glance-db-sync-hw54d\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:18 crc kubenswrapper[4753]: I0129 15:32:18.496024 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:19 crc kubenswrapper[4753]: I0129 15:32:19.005033 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hw54d"] Jan 29 15:32:19 crc kubenswrapper[4753]: I0129 15:32:19.199690 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hw54d" event={"ID":"a2871ffb-46c8-4307-90d7-6fc9402cb8cc","Type":"ContainerStarted","Data":"4e583625be30963da11f2ae3b4a78811d24aaa63e72837dee0220fcb1812010f"} Jan 29 15:32:20 crc kubenswrapper[4753]: I0129 15:32:20.218116 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hw54d" event={"ID":"a2871ffb-46c8-4307-90d7-6fc9402cb8cc","Type":"ContainerStarted","Data":"966fe858ffe3e9ff480a826a2682f30e7d74bfd056261085cf0293ab64c2dd7c"} Jan 29 15:32:20 crc kubenswrapper[4753]: I0129 15:32:20.243966 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hw54d" podStartSLOduration=2.243949168 podStartE2EDuration="2.243949168s" podCreationTimestamp="2026-01-29 15:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:20.229832978 +0000 UTC m=+5374.924567360" watchObservedRunningTime="2026-01-29 15:32:20.243949168 +0000 UTC m=+5374.938683550" Jan 29 15:32:23 crc kubenswrapper[4753]: I0129 15:32:23.244609 4753 generic.go:334] "Generic (PLEG): container finished" podID="a2871ffb-46c8-4307-90d7-6fc9402cb8cc" containerID="966fe858ffe3e9ff480a826a2682f30e7d74bfd056261085cf0293ab64c2dd7c" exitCode=0 Jan 29 15:32:23 crc kubenswrapper[4753]: I0129 15:32:23.244704 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hw54d" event={"ID":"a2871ffb-46c8-4307-90d7-6fc9402cb8cc","Type":"ContainerDied","Data":"966fe858ffe3e9ff480a826a2682f30e7d74bfd056261085cf0293ab64c2dd7c"} Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.608409 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.749308 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-config-data\") pod \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.749558 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-db-sync-config-data\") pod \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.749588 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8g2n\" (UniqueName: \"kubernetes.io/projected/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-kube-api-access-n8g2n\") pod \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.749615 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-combined-ca-bundle\") pod \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\" (UID: \"a2871ffb-46c8-4307-90d7-6fc9402cb8cc\") " Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.755124 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-kube-api-access-n8g2n" (OuterVolumeSpecName: "kube-api-access-n8g2n") pod "a2871ffb-46c8-4307-90d7-6fc9402cb8cc" (UID: "a2871ffb-46c8-4307-90d7-6fc9402cb8cc"). InnerVolumeSpecName "kube-api-access-n8g2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.755310 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a2871ffb-46c8-4307-90d7-6fc9402cb8cc" (UID: "a2871ffb-46c8-4307-90d7-6fc9402cb8cc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.772874 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2871ffb-46c8-4307-90d7-6fc9402cb8cc" (UID: "a2871ffb-46c8-4307-90d7-6fc9402cb8cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.796534 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-config-data" (OuterVolumeSpecName: "config-data") pod "a2871ffb-46c8-4307-90d7-6fc9402cb8cc" (UID: "a2871ffb-46c8-4307-90d7-6fc9402cb8cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.851143 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.851187 4753 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.851199 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8g2n\" (UniqueName: \"kubernetes.io/projected/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-kube-api-access-n8g2n\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:24 crc kubenswrapper[4753]: I0129 15:32:24.851209 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2871ffb-46c8-4307-90d7-6fc9402cb8cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.266894 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hw54d" event={"ID":"a2871ffb-46c8-4307-90d7-6fc9402cb8cc","Type":"ContainerDied","Data":"4e583625be30963da11f2ae3b4a78811d24aaa63e72837dee0220fcb1812010f"} Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.267180 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e583625be30963da11f2ae3b4a78811d24aaa63e72837dee0220fcb1812010f" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.267002 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hw54d" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.568214 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:32:25 crc kubenswrapper[4753]: E0129 15:32:25.568908 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2871ffb-46c8-4307-90d7-6fc9402cb8cc" containerName="glance-db-sync" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.568932 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2871ffb-46c8-4307-90d7-6fc9402cb8cc" containerName="glance-db-sync" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.569144 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2871ffb-46c8-4307-90d7-6fc9402cb8cc" containerName="glance-db-sync" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.570262 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.572819 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.572990 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.573174 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.574097 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rltdb" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.583032 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.667788 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.667867 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-logs\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.668020 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-ceph\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.668075 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.668181 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6t5t\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-kube-api-access-j6t5t\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.668417 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.668522 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.697309 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b76cd765c-m6k2s"] Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.698813 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.730201 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b76cd765c-m6k2s"] Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.770300 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.770384 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.770438 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.770467 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-logs\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.770715 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-ceph\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.770732 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.770762 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6t5t\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-kube-api-access-j6t5t\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.771512 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-logs\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.771670 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.775661 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.776995 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.777215 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-ceph\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.777247 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.789492 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6t5t\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-kube-api-access-j6t5t\") pod \"glance-default-external-api-0\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.848649 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.851661 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.854849 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.861317 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.873850 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.873943 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl4xf\" (UniqueName: \"kubernetes.io/projected/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-kube-api-access-sl4xf\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.874041 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.874093 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-config\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.874186 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-dns-svc\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.914387 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.975893 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.975954 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkvl\" (UniqueName: \"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-kube-api-access-ckkvl\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976021 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl4xf\" (UniqueName: \"kubernetes.io/projected/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-kube-api-access-sl4xf\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976109 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976279 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976369 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976420 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976506 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-config\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976606 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976640 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-dns-svc\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976718 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.976780 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.977525 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.977548 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-config\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.977708 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-dns-svc\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:25 crc kubenswrapper[4753]: I0129 15:32:25.978031 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.018321 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl4xf\" (UniqueName: \"kubernetes.io/projected/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-kube-api-access-sl4xf\") pod \"dnsmasq-dns-7b76cd765c-m6k2s\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") " pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.078412 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc 
kubenswrapper[4753]: I0129 15:32:26.078506 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.078535 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.078590 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckkvl\" (UniqueName: \"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-kube-api-access-ckkvl\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.078685 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.078737 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.078787 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.080802 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.081429 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.083054 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.088784 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.089109 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.091548 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.108863 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkvl\" (UniqueName: \"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-kube-api-access-ckkvl\") pod \"glance-default-internal-api-0\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.183819 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.316331 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.498099 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.611509 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.706799 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 15:32:26 crc kubenswrapper[4753]: W0129 15:32:26.711420 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c94e83d_b413_4414_82ff_22ee5021e2f1.slice/crio-1ad839a1e79df0f33c891a47e01fff1ee2fd69cbdfebbe4f44807ca4263c0be5 WatchSource:0}: Error finding container 1ad839a1e79df0f33c891a47e01fff1ee2fd69cbdfebbe4f44807ca4263c0be5: Status 404 returned error can't find the container with id 1ad839a1e79df0f33c891a47e01fff1ee2fd69cbdfebbe4f44807ca4263c0be5 Jan 29 15:32:26 crc kubenswrapper[4753]: I0129 15:32:26.837210 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b76cd765c-m6k2s"] Jan 29 15:32:27 crc kubenswrapper[4753]: I0129 15:32:27.291926 4753 generic.go:334] "Generic (PLEG): container finished" podID="bb56f0c2-00da-4cc4-abe7-5f1ee656190b" containerID="69af38e1cbfd5738cde4283f5589ed9cd1922619257a1e05eb6e271e5c8c20d8" exitCode=0 Jan 29 15:32:27 crc kubenswrapper[4753]: I0129 15:32:27.292021 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" event={"ID":"bb56f0c2-00da-4cc4-abe7-5f1ee656190b","Type":"ContainerDied","Data":"69af38e1cbfd5738cde4283f5589ed9cd1922619257a1e05eb6e271e5c8c20d8"} Jan 29 15:32:27 crc kubenswrapper[4753]: I0129 
15:32:27.292389 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" event={"ID":"bb56f0c2-00da-4cc4-abe7-5f1ee656190b","Type":"ContainerStarted","Data":"9e7fd66f1781b9103ed24cb6b4e752d8f9db839759f26cfc4301f2cddc2db42c"} Jan 29 15:32:27 crc kubenswrapper[4753]: I0129 15:32:27.294949 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6bb26f7-539a-421f-ad9a-e70f29fdadfe","Type":"ContainerStarted","Data":"5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143"} Jan 29 15:32:27 crc kubenswrapper[4753]: I0129 15:32:27.295005 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6bb26f7-539a-421f-ad9a-e70f29fdadfe","Type":"ContainerStarted","Data":"c0332a2afe05770ec792a393026f895190b7e15e1b0f4c647a817d6dae1361fb"} Jan 29 15:32:27 crc kubenswrapper[4753]: I0129 15:32:27.297931 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c94e83d-b413-4414-82ff-22ee5021e2f1","Type":"ContainerStarted","Data":"1ad839a1e79df0f33c891a47e01fff1ee2fd69cbdfebbe4f44807ca4263c0be5"} Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.305775 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c94e83d-b413-4414-82ff-22ee5021e2f1","Type":"ContainerStarted","Data":"8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df"} Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.306078 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c94e83d-b413-4414-82ff-22ee5021e2f1","Type":"ContainerStarted","Data":"083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b"} Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.309029 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" event={"ID":"bb56f0c2-00da-4cc4-abe7-5f1ee656190b","Type":"ContainerStarted","Data":"4eac7d072e82edbbd9f29931d6540e3f2eaad20486cb98fe983cf9c4b0cd47dd"} Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.309181 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.311609 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6bb26f7-539a-421f-ad9a-e70f29fdadfe","Type":"ContainerStarted","Data":"cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee"} Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.311733 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerName="glance-log" containerID="cri-o://5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143" gracePeriod=30 Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.311936 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerName="glance-httpd" containerID="cri-o://cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee" gracePeriod=30 Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.326951 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.326936287 podStartE2EDuration="3.326936287s" podCreationTimestamp="2026-01-29 15:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:28.325316894 +0000 UTC m=+5383.020051276" watchObservedRunningTime="2026-01-29 15:32:28.326936287 +0000 UTC m=+5383.021670669" Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.349848 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" podStartSLOduration=3.349829464 podStartE2EDuration="3.349829464s" podCreationTimestamp="2026-01-29 15:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:28.348102118 +0000 UTC m=+5383.042836510" watchObservedRunningTime="2026-01-29 15:32:28.349829464 +0000 UTC m=+5383.044563846" Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.377601 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.377576022 podStartE2EDuration="3.377576022s" podCreationTimestamp="2026-01-29 15:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:28.369013712 +0000 UTC m=+5383.063748094" watchObservedRunningTime="2026-01-29 15:32:28.377576022 +0000 UTC m=+5383.072310404" Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.673845 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 15:32:28 crc kubenswrapper[4753]: I0129 15:32:28.911252 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.046413 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6t5t\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-kube-api-access-j6t5t\") pod \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.046468 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-ceph\") pod \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.046531 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-config-data\") pod \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.046637 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-logs\") pod \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.046663 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-httpd-run\") pod \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.046713 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-scripts\") pod \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.046760 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-combined-ca-bundle\") pod \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\" (UID: \"a6bb26f7-539a-421f-ad9a-e70f29fdadfe\") " Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.047080 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-logs" (OuterVolumeSpecName: "logs") pod "a6bb26f7-539a-421f-ad9a-e70f29fdadfe" (UID: "a6bb26f7-539a-421f-ad9a-e70f29fdadfe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.052608 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a6bb26f7-539a-421f-ad9a-e70f29fdadfe" (UID: "a6bb26f7-539a-421f-ad9a-e70f29fdadfe"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.053423 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-kube-api-access-j6t5t" (OuterVolumeSpecName: "kube-api-access-j6t5t") pod "a6bb26f7-539a-421f-ad9a-e70f29fdadfe" (UID: "a6bb26f7-539a-421f-ad9a-e70f29fdadfe"). InnerVolumeSpecName "kube-api-access-j6t5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.053535 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-ceph" (OuterVolumeSpecName: "ceph") pod "a6bb26f7-539a-421f-ad9a-e70f29fdadfe" (UID: "a6bb26f7-539a-421f-ad9a-e70f29fdadfe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.070531 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-scripts" (OuterVolumeSpecName: "scripts") pod "a6bb26f7-539a-421f-ad9a-e70f29fdadfe" (UID: "a6bb26f7-539a-421f-ad9a-e70f29fdadfe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.072792 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6bb26f7-539a-421f-ad9a-e70f29fdadfe" (UID: "a6bb26f7-539a-421f-ad9a-e70f29fdadfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.126313 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-config-data" (OuterVolumeSpecName: "config-data") pod "a6bb26f7-539a-421f-ad9a-e70f29fdadfe" (UID: "a6bb26f7-539a-421f-ad9a-e70f29fdadfe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.151445 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.151484 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.151498 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.151510 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.151528 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6t5t\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-kube-api-access-j6t5t\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.151539 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-ceph\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.151550 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6bb26f7-539a-421f-ad9a-e70f29fdadfe-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.322959 4753 generic.go:334] "Generic (PLEG): container finished" podID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerID="cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee" exitCode=0 Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.323303 4753 generic.go:334] "Generic (PLEG): container finished" podID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerID="5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143" exitCode=143 Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.323019 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6bb26f7-539a-421f-ad9a-e70f29fdadfe","Type":"ContainerDied","Data":"cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee"} Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.324082 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6bb26f7-539a-421f-ad9a-e70f29fdadfe","Type":"ContainerDied","Data":"5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143"} Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.324098 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6bb26f7-539a-421f-ad9a-e70f29fdadfe","Type":"ContainerDied","Data":"c0332a2afe05770ec792a393026f895190b7e15e1b0f4c647a817d6dae1361fb"} Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.324114 4753 scope.go:117] "RemoveContainer" containerID="cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 
15:32:29.323047 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.356941 4753 scope.go:117] "RemoveContainer" containerID="5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.363270 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.379756 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.389803 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:32:29 crc kubenswrapper[4753]: E0129 15:32:29.390245 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerName="glance-log" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.390267 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerName="glance-log" Jan 29 15:32:29 crc kubenswrapper[4753]: E0129 15:32:29.390303 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerName="glance-httpd" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.390309 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerName="glance-httpd" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.390448 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerName="glance-httpd" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.390469 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" containerName="glance-log" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.391312 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.391557 4753 scope.go:117] "RemoveContainer" containerID="cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee" Jan 29 15:32:29 crc kubenswrapper[4753]: E0129 15:32:29.393777 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee\": container with ID starting with cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee not found: ID does not exist" containerID="cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.393901 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee"} err="failed to get container status \"cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee\": rpc error: code = NotFound desc = could not find container \"cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee\": container with ID starting with cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee not found: ID does not exist" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.393990 4753 scope.go:117] "RemoveContainer" containerID="5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.393830 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 15:32:29 crc kubenswrapper[4753]: E0129 15:32:29.395530 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143\": container with ID starting with 5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143 not found: ID does not exist" containerID="5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.395568 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143"} err="failed to get container status \"5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143\": rpc error: code = NotFound desc = could not find container \"5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143\": container with ID starting with 5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143 not found: ID does not exist" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.395590 4753 scope.go:117] "RemoveContainer" containerID="cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.396017 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee"} err="failed to get container status \"cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee\": rpc error: code = NotFound desc = could not find container \"cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee\": container with ID starting with cb85a59b12c7db4c46965fb7106c6d6c6a82b74a66d077cd13538edab38c9eee not found: ID does not exist" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 
15:32:29.396037 4753 scope.go:117] "RemoveContainer" containerID="5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.396385 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143"} err="failed to get container status \"5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143\": rpc error: code = NotFound desc = could not find container \"5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143\": container with ID starting with 5122cfa8c7c89503610602ec28f2708bab9c176678073ca269e9a2e95d177143 not found: ID does not exist" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.408686 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.562875 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.562927 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.562966 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.562988 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-logs\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.563078 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6bgc\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-kube-api-access-v6bgc\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.563134 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.563263 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.665332 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-logs\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.665398 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6bgc\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-kube-api-access-v6bgc\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.665450 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.665512 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.665559 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.665583 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.665615 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.666112 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.666504 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-logs\") pod \"glance-default-external-api-0\" (UID: 
\"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.670363 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-ceph\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.670467 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.670584 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.672253 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.684568 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6bgc\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-kube-api-access-v6bgc\") pod \"glance-default-external-api-0\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " pod="openstack/glance-default-external-api-0" Jan 29 15:32:29 crc kubenswrapper[4753]: I0129 15:32:29.740946 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:32:30 crc kubenswrapper[4753]: I0129 15:32:30.163103 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6bb26f7-539a-421f-ad9a-e70f29fdadfe" path="/var/lib/kubelet/pods/a6bb26f7-539a-421f-ad9a-e70f29fdadfe/volumes" Jan 29 15:32:30 crc kubenswrapper[4753]: I0129 15:32:30.305111 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:32:30 crc kubenswrapper[4753]: I0129 15:32:30.338872 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71bc7903-cddb-464d-a5ae-ef660282f4b7","Type":"ContainerStarted","Data":"ca5beb51d8041f71c92171d514c8ab7e90f6f768b56c8ff5572babbf1ee1b393"} Jan 29 15:32:30 crc kubenswrapper[4753]: I0129 15:32:30.339030 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerName="glance-log" containerID="cri-o://083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b" gracePeriod=30 Jan 29 15:32:30 crc kubenswrapper[4753]: I0129 15:32:30.339105 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerName="glance-httpd" containerID="cri-o://8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df" gracePeriod=30 Jan 29 15:32:30 crc kubenswrapper[4753]: I0129 15:32:30.991731 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.092645 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-config-data\") pod \"3c94e83d-b413-4414-82ff-22ee5021e2f1\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.092733 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-combined-ca-bundle\") pod \"3c94e83d-b413-4414-82ff-22ee5021e2f1\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.092792 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-logs\") pod \"3c94e83d-b413-4414-82ff-22ee5021e2f1\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.092825 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckkvl\" (UniqueName: \"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-kube-api-access-ckkvl\") pod \"3c94e83d-b413-4414-82ff-22ee5021e2f1\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.092839 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-httpd-run\") pod \"3c94e83d-b413-4414-82ff-22ee5021e2f1\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.092918 4753 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-scripts\") pod \"3c94e83d-b413-4414-82ff-22ee5021e2f1\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.092947 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-ceph\") pod \"3c94e83d-b413-4414-82ff-22ee5021e2f1\" (UID: \"3c94e83d-b413-4414-82ff-22ee5021e2f1\") " Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.094557 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3c94e83d-b413-4414-82ff-22ee5021e2f1" (UID: "3c94e83d-b413-4414-82ff-22ee5021e2f1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.094581 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-logs" (OuterVolumeSpecName: "logs") pod "3c94e83d-b413-4414-82ff-22ee5021e2f1" (UID: "3c94e83d-b413-4414-82ff-22ee5021e2f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.098058 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-kube-api-access-ckkvl" (OuterVolumeSpecName: "kube-api-access-ckkvl") pod "3c94e83d-b413-4414-82ff-22ee5021e2f1" (UID: "3c94e83d-b413-4414-82ff-22ee5021e2f1"). InnerVolumeSpecName "kube-api-access-ckkvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.099022 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-ceph" (OuterVolumeSpecName: "ceph") pod "3c94e83d-b413-4414-82ff-22ee5021e2f1" (UID: "3c94e83d-b413-4414-82ff-22ee5021e2f1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.101563 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-scripts" (OuterVolumeSpecName: "scripts") pod "3c94e83d-b413-4414-82ff-22ee5021e2f1" (UID: "3c94e83d-b413-4414-82ff-22ee5021e2f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.123637 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c94e83d-b413-4414-82ff-22ee5021e2f1" (UID: "3c94e83d-b413-4414-82ff-22ee5021e2f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.153824 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-config-data" (OuterVolumeSpecName: "config-data") pod "3c94e83d-b413-4414-82ff-22ee5021e2f1" (UID: "3c94e83d-b413-4414-82ff-22ee5021e2f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.196691 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.196735 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.196752 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.196765 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckkvl\" (UniqueName: \"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-kube-api-access-ckkvl\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.196778 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c94e83d-b413-4414-82ff-22ee5021e2f1-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.196788 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c94e83d-b413-4414-82ff-22ee5021e2f1-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.196821 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3c94e83d-b413-4414-82ff-22ee5021e2f1-ceph\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.365856 4753 generic.go:334] "Generic (PLEG): container finished" podID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerID="8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df" exitCode=0 Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.366545 4753 generic.go:334] "Generic (PLEG): container finished" podID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerID="083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b" exitCode=143 Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.366655 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c94e83d-b413-4414-82ff-22ee5021e2f1","Type":"ContainerDied","Data":"8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df"} Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.366697 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c94e83d-b413-4414-82ff-22ee5021e2f1","Type":"ContainerDied","Data":"083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b"} Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.366713 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c94e83d-b413-4414-82ff-22ee5021e2f1","Type":"ContainerDied","Data":"1ad839a1e79df0f33c891a47e01fff1ee2fd69cbdfebbe4f44807ca4263c0be5"} Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.366743 4753 scope.go:117] "RemoveContainer" containerID="8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 
15:32:31.367015 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.370500 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71bc7903-cddb-464d-a5ae-ef660282f4b7","Type":"ContainerStarted","Data":"808b00d324845124c97752b5ba087e9e0abc721ec5465939c7268e00de26fe70"} Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.422795 4753 scope.go:117] "RemoveContainer" containerID="083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.441282 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.447131 4753 scope.go:117] "RemoveContainer" containerID="8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df" Jan 29 15:32:31 crc kubenswrapper[4753]: E0129 15:32:31.448367 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df\": container with ID starting with 8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df not found: ID does not exist" containerID="8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.448448 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df"} err="failed to get container status \"8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df\": rpc error: code = NotFound desc = could not find container \"8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df\": container with ID starting with 8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df not found: ID does not exist" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.448497 4753 scope.go:117] "RemoveContainer" containerID="083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b" Jan 29 15:32:31 crc kubenswrapper[4753]: E0129 15:32:31.453310 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b\": container with ID starting with 083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b not found: ID does not exist" containerID="083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.453346 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b"} err="failed to get container status \"083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b\": rpc error: code = NotFound desc = could not find container \"083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b\": container with ID starting with 083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b not found: ID does not exist" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.453369 4753 scope.go:117] "RemoveContainer" containerID="8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.453940 4753 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.454096 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df"} err="failed to get container status \"8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df\": rpc error: code = NotFound desc = could not find container \"8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df\": container with ID starting with 8e5aa95ca4da6331801877540564a861a14cb3718ddb2b2749563ad5ba8a16df not found: ID does not exist" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.454140 4753 scope.go:117] "RemoveContainer" containerID="083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.457772 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b"} err="failed to get container status \"083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b\": rpc error: code = NotFound desc = could not find container \"083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b\": container with ID starting with 083344aa014ff346bb0e1bd22e577491c3f2acf44dfb79a7fdcbad326268510b not found: ID does not exist" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.468129 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 15:32:31 crc kubenswrapper[4753]: E0129 15:32:31.468566 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerName="glance-httpd" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.468585 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerName="glance-httpd" Jan 29 15:32:31 crc kubenswrapper[4753]: E0129 15:32:31.468608 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerName="glance-log" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.468615 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerName="glance-log" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.468774 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerName="glance-log" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.468794 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c94e83d-b413-4414-82ff-22ee5021e2f1" containerName="glance-httpd" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.475545 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.478138 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.499358 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.616237 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8svll\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-kube-api-access-8svll\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.616313 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.616338 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.616363 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.616382 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.616687 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-logs\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.616738 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.718982 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-logs\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " 
pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.719304 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.719352 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8svll\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-kube-api-access-8svll\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.719413 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.719433 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.719460 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.719479 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.719761 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-logs\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.720212 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.725779 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.727084 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.728653 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.733628 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.743248 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8svll\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-kube-api-access-8svll\") pod \"glance-default-internal-api-0\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " pod="openstack/glance-default-internal-api-0" Jan 29 15:32:31 crc kubenswrapper[4753]: I0129 15:32:31.799106 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:32 crc kubenswrapper[4753]: I0129 15:32:32.162054 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c94e83d-b413-4414-82ff-22ee5021e2f1" path="/var/lib/kubelet/pods/3c94e83d-b413-4414-82ff-22ee5021e2f1/volumes" Jan 29 15:32:32 crc kubenswrapper[4753]: I0129 15:32:32.337839 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 29 15:32:32 crc kubenswrapper[4753]: I0129 15:32:32.382396 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71bc7903-cddb-464d-a5ae-ef660282f4b7","Type":"ContainerStarted","Data":"d16311801fd926af5c1796ec322bc0e2dc4d10badd0b2df0b6e5e7ad1891d62d"} Jan 29 15:32:32 crc kubenswrapper[4753]: I0129 15:32:32.386080 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4abaca50-6e4d-4947-b3e1-6b627376a788","Type":"ContainerStarted","Data":"87f5ef070045c9fae1265eb3aee9c473b256b36ac042a21a98a75232860db2ca"} Jan 29 15:32:32 crc kubenswrapper[4753]: I0129 15:32:32.417567 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.417547115 podStartE2EDuration="3.417547115s" podCreationTimestamp="2026-01-29 15:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:32.404875094 +0000 UTC m=+5387.099609476" watchObservedRunningTime="2026-01-29 15:32:32.417547115 +0000 UTC m=+5387.112281497" Jan 29 15:32:33 crc kubenswrapper[4753]: I0129 15:32:33.403868 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4abaca50-6e4d-4947-b3e1-6b627376a788","Type":"ContainerStarted","Data":"4505286665ec8df6d6f490fb8862aaf09f3220c1a84751ff2c473a2b109913b7"} Jan 
29 15:32:34 crc kubenswrapper[4753]: I0129 15:32:34.416623 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4abaca50-6e4d-4947-b3e1-6b627376a788","Type":"ContainerStarted","Data":"891128b69da3f3bbbd8efd446640966b20c7879b34a670191412181677ba15c5"} Jan 29 15:32:34 crc kubenswrapper[4753]: I0129 15:32:34.445661 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.445639731 podStartE2EDuration="3.445639731s" podCreationTimestamp="2026-01-29 15:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:34.444325826 +0000 UTC m=+5389.139060208" watchObservedRunningTime="2026-01-29 15:32:34.445639731 +0000 UTC m=+5389.140374113" Jan 29 15:32:36 crc kubenswrapper[4753]: I0129 15:32:36.318386 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" Jan 29 15:32:36 crc kubenswrapper[4753]: I0129 15:32:36.398113 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcb785c5-whf45"] Jan 29 15:32:36 crc kubenswrapper[4753]: I0129 15:32:36.398391 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" podUID="b32c6f3c-139b-42b7-8f6c-5a93c188c343" containerName="dnsmasq-dns" containerID="cri-o://fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe" gracePeriod=10 Jan 29 15:32:36 crc kubenswrapper[4753]: I0129 15:32:36.939336 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.121068 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-nb\") pod \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.121192 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-dns-svc\") pod \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.121219 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-sb\") pod \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.121283 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgpk6\" (UniqueName: \"kubernetes.io/projected/b32c6f3c-139b-42b7-8f6c-5a93c188c343-kube-api-access-kgpk6\") pod \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.121351 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-config\") pod \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\" (UID: \"b32c6f3c-139b-42b7-8f6c-5a93c188c343\") " Jan 29 15:32:37 crc 
kubenswrapper[4753]: I0129 15:32:37.129579 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32c6f3c-139b-42b7-8f6c-5a93c188c343-kube-api-access-kgpk6" (OuterVolumeSpecName: "kube-api-access-kgpk6") pod "b32c6f3c-139b-42b7-8f6c-5a93c188c343" (UID: "b32c6f3c-139b-42b7-8f6c-5a93c188c343"). InnerVolumeSpecName "kube-api-access-kgpk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.165113 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-config" (OuterVolumeSpecName: "config") pod "b32c6f3c-139b-42b7-8f6c-5a93c188c343" (UID: "b32c6f3c-139b-42b7-8f6c-5a93c188c343"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.166982 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b32c6f3c-139b-42b7-8f6c-5a93c188c343" (UID: "b32c6f3c-139b-42b7-8f6c-5a93c188c343"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.169477 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b32c6f3c-139b-42b7-8f6c-5a93c188c343" (UID: "b32c6f3c-139b-42b7-8f6c-5a93c188c343"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.181244 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b32c6f3c-139b-42b7-8f6c-5a93c188c343" (UID: "b32c6f3c-139b-42b7-8f6c-5a93c188c343"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.224496 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.224530 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.224543 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.224557 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgpk6\" (UniqueName: \"kubernetes.io/projected/b32c6f3c-139b-42b7-8f6c-5a93c188c343-kube-api-access-kgpk6\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.224570 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32c6f3c-139b-42b7-8f6c-5a93c188c343-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.449978 4753 generic.go:334] "Generic (PLEG): container finished" podID="b32c6f3c-139b-42b7-8f6c-5a93c188c343" containerID="fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe" exitCode=0 Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.450022 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" event={"ID":"b32c6f3c-139b-42b7-8f6c-5a93c188c343","Type":"ContainerDied","Data":"fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe"} Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.450047 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" event={"ID":"b32c6f3c-139b-42b7-8f6c-5a93c188c343","Type":"ContainerDied","Data":"51eeadff935a19b5197ff40980293dc303e5c92b231b0c4de230df0335ff721c"} Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.450062 4753 scope.go:117] "RemoveContainer" containerID="fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.450190 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcb785c5-whf45" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.484501 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcb785c5-whf45"] Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.486143 4753 scope.go:117] "RemoveContainer" containerID="70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.492399 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcb785c5-whf45"] Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.515114 4753 scope.go:117] "RemoveContainer" containerID="fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe" Jan 29 15:32:37 crc kubenswrapper[4753]: E0129 15:32:37.518728 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe\": container with ID starting with fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe not found: ID does not exist" containerID="fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.518809 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe"} err="failed to get container status \"fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe\": rpc error: code = NotFound desc = could not find container \"fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe\": container with ID starting with fca38da8f123b5664e65ab33a6cfe683a1b1ad5eb4f4e0018ade3106c7f06ebe not found: ID does not exist" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.518859 4753 scope.go:117] "RemoveContainer" containerID="70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320" Jan 29 15:32:37 crc kubenswrapper[4753]: E0129 15:32:37.519505 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320\": container with ID starting with 70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320 not found: ID does not exist" containerID="70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320" Jan 29 15:32:37 crc kubenswrapper[4753]: I0129 15:32:37.519550 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320"} err="failed to get container status \"70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320\": rpc error: code = NotFound desc = could not find container \"70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320\": container with ID starting with 70ef443b38fba621373bab4b8e89d28c90acc6614484122f6637895943637320 not found: ID does not exist" Jan 29 15:32:38 crc kubenswrapper[4753]: I0129 15:32:38.163583 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b32c6f3c-139b-42b7-8f6c-5a93c188c343" path="/var/lib/kubelet/pods/b32c6f3c-139b-42b7-8f6c-5a93c188c343/volumes" Jan 29 15:32:39 crc kubenswrapper[4753]: I0129 15:32:39.741485 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 15:32:39 crc kubenswrapper[4753]: I0129 15:32:39.741660 4753 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 29 15:32:39 crc kubenswrapper[4753]: I0129 15:32:39.774254 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 15:32:39 crc kubenswrapper[4753]: I0129 15:32:39.792690 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 29 15:32:40 crc kubenswrapper[4753]: I0129 15:32:40.482356 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 15:32:40 crc kubenswrapper[4753]: I0129 15:32:40.482656 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 29 15:32:41 crc kubenswrapper[4753]: I0129 15:32:41.799771 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:41 crc kubenswrapper[4753]: I0129 15:32:41.799836 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:41 crc kubenswrapper[4753]: I0129 15:32:41.829240 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:41 crc kubenswrapper[4753]: I0129 15:32:41.845222 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:42 crc kubenswrapper[4753]: I0129 15:32:42.500223 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:42 crc kubenswrapper[4753]: I0129 15:32:42.500638 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:42 crc kubenswrapper[4753]: I0129 15:32:42.536580 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 15:32:42 crc kubenswrapper[4753]: I0129 15:32:42.536679 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 15:32:42 crc kubenswrapper[4753]: I0129 15:32:42.567650 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 29 15:32:44 crc kubenswrapper[4753]: I0129 15:32:44.584589 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:44 crc kubenswrapper[4753]: I0129 15:32:44.584894 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.507520 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-bqwgr"] Jan 29 15:32:50 crc kubenswrapper[4753]: E0129 15:32:50.508421 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32c6f3c-139b-42b7-8f6c-5a93c188c343" containerName="dnsmasq-dns" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.508434 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32c6f3c-139b-42b7-8f6c-5a93c188c343" containerName="dnsmasq-dns" Jan 29 15:32:50 crc kubenswrapper[4753]: E0129 15:32:50.508448 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32c6f3c-139b-42b7-8f6c-5a93c188c343" containerName="init" Jan 29 
15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.508455 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32c6f3c-139b-42b7-8f6c-5a93c188c343" containerName="init" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.508601 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32c6f3c-139b-42b7-8f6c-5a93c188c343" containerName="dnsmasq-dns" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.509213 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.520246 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bqwgr"] Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.608661 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8d6c-account-create-update-nnjmp"] Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.608973 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba286775-81ab-4630-bf6e-0824c2a89a6b-operator-scripts\") pod \"placement-db-create-bqwgr\" (UID: \"ba286775-81ab-4630-bf6e-0824c2a89a6b\") " pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.609138 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5drq\" (UniqueName: \"kubernetes.io/projected/ba286775-81ab-4630-bf6e-0824c2a89a6b-kube-api-access-m5drq\") pod \"placement-db-create-bqwgr\" (UID: \"ba286775-81ab-4630-bf6e-0824c2a89a6b\") " pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.610189 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.612630 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.617181 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d6c-account-create-update-nnjmp"] Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.710740 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba286775-81ab-4630-bf6e-0824c2a89a6b-operator-scripts\") pod \"placement-db-create-bqwgr\" (UID: \"ba286775-81ab-4630-bf6e-0824c2a89a6b\") " pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.710829 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwl4\" (UniqueName: \"kubernetes.io/projected/f1b0f669-00e6-4f72-bc26-91272c722e84-kube-api-access-pvwl4\") pod \"placement-8d6c-account-create-update-nnjmp\" (UID: \"f1b0f669-00e6-4f72-bc26-91272c722e84\") " pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.710888 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b0f669-00e6-4f72-bc26-91272c722e84-operator-scripts\") pod \"placement-8d6c-account-create-update-nnjmp\" (UID: \"f1b0f669-00e6-4f72-bc26-91272c722e84\") " pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.710916 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5drq\" (UniqueName: \"kubernetes.io/projected/ba286775-81ab-4630-bf6e-0824c2a89a6b-kube-api-access-m5drq\") pod \"placement-db-create-bqwgr\" (UID: \"ba286775-81ab-4630-bf6e-0824c2a89a6b\") " pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.711630 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba286775-81ab-4630-bf6e-0824c2a89a6b-operator-scripts\") pod \"placement-db-create-bqwgr\" (UID: \"ba286775-81ab-4630-bf6e-0824c2a89a6b\") " pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.730483 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5drq\" (UniqueName: \"kubernetes.io/projected/ba286775-81ab-4630-bf6e-0824c2a89a6b-kube-api-access-m5drq\") pod \"placement-db-create-bqwgr\" (UID: \"ba286775-81ab-4630-bf6e-0824c2a89a6b\") " pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.812908 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwl4\" (UniqueName: \"kubernetes.io/projected/f1b0f669-00e6-4f72-bc26-91272c722e84-kube-api-access-pvwl4\") pod \"placement-8d6c-account-create-update-nnjmp\" (UID: \"f1b0f669-00e6-4f72-bc26-91272c722e84\") " pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.812985 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f1b0f669-00e6-4f72-bc26-91272c722e84-operator-scripts\") pod \"placement-8d6c-account-create-update-nnjmp\" (UID: \"f1b0f669-00e6-4f72-bc26-91272c722e84\") " pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.813839 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b0f669-00e6-4f72-bc26-91272c722e84-operator-scripts\") pod \"placement-8d6c-account-create-update-nnjmp\" (UID: \"f1b0f669-00e6-4f72-bc26-91272c722e84\") " pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.826823 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.828855 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwl4\" (UniqueName: \"kubernetes.io/projected/f1b0f669-00e6-4f72-bc26-91272c722e84-kube-api-access-pvwl4\") pod \"placement-8d6c-account-create-update-nnjmp\" (UID: \"f1b0f669-00e6-4f72-bc26-91272c722e84\") " pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:50 crc kubenswrapper[4753]: I0129 15:32:50.926417 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:51 crc kubenswrapper[4753]: I0129 15:32:51.263992 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bqwgr"] Jan 29 15:32:51 crc kubenswrapper[4753]: W0129 15:32:51.265300 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba286775_81ab_4630_bf6e_0824c2a89a6b.slice/crio-5b4ac6a9ad5d9d7b97b8b750bc950271fc91c12f7e5359eedeefb2bcb542d2f4 WatchSource:0}: Error finding container 5b4ac6a9ad5d9d7b97b8b750bc950271fc91c12f7e5359eedeefb2bcb542d2f4: Status 404 returned error can't find the container with id 5b4ac6a9ad5d9d7b97b8b750bc950271fc91c12f7e5359eedeefb2bcb542d2f4 Jan 29 15:32:51 crc kubenswrapper[4753]: I0129 15:32:51.394371 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d6c-account-create-update-nnjmp"] Jan 29 15:32:51 crc kubenswrapper[4753]: I0129 15:32:51.574181 4753 generic.go:334] "Generic (PLEG): container finished" podID="ba286775-81ab-4630-bf6e-0824c2a89a6b" containerID="c12fc654db8e9dad7f73f6cafd5bb44659472fea9254158b325fb2406d1ae05d" exitCode=0 Jan 29 15:32:51 crc kubenswrapper[4753]: I0129 15:32:51.574270 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bqwgr" event={"ID":"ba286775-81ab-4630-bf6e-0824c2a89a6b","Type":"ContainerDied","Data":"c12fc654db8e9dad7f73f6cafd5bb44659472fea9254158b325fb2406d1ae05d"} Jan 29 15:32:51 crc kubenswrapper[4753]: I0129 15:32:51.574515 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bqwgr" event={"ID":"ba286775-81ab-4630-bf6e-0824c2a89a6b","Type":"ContainerStarted","Data":"5b4ac6a9ad5d9d7b97b8b750bc950271fc91c12f7e5359eedeefb2bcb542d2f4"} Jan 29 15:32:51 crc kubenswrapper[4753]: I0129 15:32:51.576750 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6c-account-create-update-nnjmp" event={"ID":"f1b0f669-00e6-4f72-bc26-91272c722e84","Type":"ContainerStarted","Data":"8c5b4650ab8532e0dce229abbef4cae22e01c1012f7c1891031c24f505c83b31"} Jan 29 
15:32:51 crc kubenswrapper[4753]: I0129 15:32:51.576797 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6c-account-create-update-nnjmp" event={"ID":"f1b0f669-00e6-4f72-bc26-91272c722e84","Type":"ContainerStarted","Data":"e8f4fd1ceb6f624ecffd202384fa17031774c785797b83a9314dc56ae09cd62f"} Jan 29 15:32:51 crc kubenswrapper[4753]: I0129 15:32:51.610232 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8d6c-account-create-update-nnjmp" podStartSLOduration=1.61021283 podStartE2EDuration="1.61021283s" podCreationTimestamp="2026-01-29 15:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:51.603977972 +0000 UTC m=+5406.298712354" watchObservedRunningTime="2026-01-29 15:32:51.61021283 +0000 UTC m=+5406.304947212" Jan 29 15:32:52 crc kubenswrapper[4753]: I0129 15:32:52.585999 4753 generic.go:334] "Generic (PLEG): container finished" podID="f1b0f669-00e6-4f72-bc26-91272c722e84" containerID="8c5b4650ab8532e0dce229abbef4cae22e01c1012f7c1891031c24f505c83b31" exitCode=0 Jan 29 15:32:52 crc kubenswrapper[4753]: I0129 15:32:52.586803 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6c-account-create-update-nnjmp" event={"ID":"f1b0f669-00e6-4f72-bc26-91272c722e84","Type":"ContainerDied","Data":"8c5b4650ab8532e0dce229abbef4cae22e01c1012f7c1891031c24f505c83b31"} Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.402076 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.408203 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.480905 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5drq\" (UniqueName: \"kubernetes.io/projected/ba286775-81ab-4630-bf6e-0824c2a89a6b-kube-api-access-m5drq\") pod \"ba286775-81ab-4630-bf6e-0824c2a89a6b\" (UID: \"ba286775-81ab-4630-bf6e-0824c2a89a6b\") " Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.481226 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba286775-81ab-4630-bf6e-0824c2a89a6b-operator-scripts\") pod \"ba286775-81ab-4630-bf6e-0824c2a89a6b\" (UID: \"ba286775-81ab-4630-bf6e-0824c2a89a6b\") " Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.481406 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvwl4\" (UniqueName: \"kubernetes.io/projected/f1b0f669-00e6-4f72-bc26-91272c722e84-kube-api-access-pvwl4\") pod \"f1b0f669-00e6-4f72-bc26-91272c722e84\" (UID: \"f1b0f669-00e6-4f72-bc26-91272c722e84\") " Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.481614 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b0f669-00e6-4f72-bc26-91272c722e84-operator-scripts\") pod \"f1b0f669-00e6-4f72-bc26-91272c722e84\" (UID: \"f1b0f669-00e6-4f72-bc26-91272c722e84\") " Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.481833 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba286775-81ab-4630-bf6e-0824c2a89a6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba286775-81ab-4630-bf6e-0824c2a89a6b" (UID: "ba286775-81ab-4630-bf6e-0824c2a89a6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.482058 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1b0f669-00e6-4f72-bc26-91272c722e84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1b0f669-00e6-4f72-bc26-91272c722e84" (UID: "f1b0f669-00e6-4f72-bc26-91272c722e84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.482285 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b0f669-00e6-4f72-bc26-91272c722e84-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.482380 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba286775-81ab-4630-bf6e-0824c2a89a6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.486570 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1b0f669-00e6-4f72-bc26-91272c722e84-kube-api-access-pvwl4" (OuterVolumeSpecName: "kube-api-access-pvwl4") pod "f1b0f669-00e6-4f72-bc26-91272c722e84" (UID: "f1b0f669-00e6-4f72-bc26-91272c722e84"). InnerVolumeSpecName "kube-api-access-pvwl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.490269 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba286775-81ab-4630-bf6e-0824c2a89a6b-kube-api-access-m5drq" (OuterVolumeSpecName: "kube-api-access-m5drq") pod "ba286775-81ab-4630-bf6e-0824c2a89a6b" (UID: "ba286775-81ab-4630-bf6e-0824c2a89a6b"). InnerVolumeSpecName "kube-api-access-m5drq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.583720 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvwl4\" (UniqueName: \"kubernetes.io/projected/f1b0f669-00e6-4f72-bc26-91272c722e84-kube-api-access-pvwl4\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.583934 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5drq\" (UniqueName: \"kubernetes.io/projected/ba286775-81ab-4630-bf6e-0824c2a89a6b-kube-api-access-m5drq\") on node \"crc\" DevicePath \"\"" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.607701 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bqwgr" event={"ID":"ba286775-81ab-4630-bf6e-0824c2a89a6b","Type":"ContainerDied","Data":"5b4ac6a9ad5d9d7b97b8b750bc950271fc91c12f7e5359eedeefb2bcb542d2f4"} Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.607927 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4ac6a9ad5d9d7b97b8b750bc950271fc91c12f7e5359eedeefb2bcb542d2f4" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.607730 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bqwgr" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.609713 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6c-account-create-update-nnjmp" event={"ID":"f1b0f669-00e6-4f72-bc26-91272c722e84","Type":"ContainerDied","Data":"e8f4fd1ceb6f624ecffd202384fa17031774c785797b83a9314dc56ae09cd62f"} Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.609746 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8f4fd1ceb6f624ecffd202384fa17031774c785797b83a9314dc56ae09cd62f" Jan 29 15:32:54 crc kubenswrapper[4753]: I0129 15:32:54.609797 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8d6c-account-create-update-nnjmp" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.972708 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-cpnns"] Jan 29 15:32:55 crc kubenswrapper[4753]: E0129 15:32:55.973837 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b0f669-00e6-4f72-bc26-91272c722e84" containerName="mariadb-account-create-update" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.973857 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b0f669-00e6-4f72-bc26-91272c722e84" containerName="mariadb-account-create-update" Jan 29 15:32:55 crc kubenswrapper[4753]: E0129 15:32:55.973907 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba286775-81ab-4630-bf6e-0824c2a89a6b" containerName="mariadb-database-create" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.973915 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba286775-81ab-4630-bf6e-0824c2a89a6b" containerName="mariadb-database-create" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.974326 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba286775-81ab-4630-bf6e-0824c2a89a6b" containerName="mariadb-database-create" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.974353 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b0f669-00e6-4f72-bc26-91272c722e84" containerName="mariadb-account-create-update" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.975654 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.979028 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.979325 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.979395 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f2t7w" Jan 29 15:32:55 crc kubenswrapper[4753]: I0129 15:32:55.995425 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cpnns"] Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.013487 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtb4w\" (UniqueName: \"kubernetes.io/projected/b4bb3975-ee1c-4d33-b619-d0f898300c93-kube-api-access-qtb4w\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.013531 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-scripts\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.013568 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-combined-ca-bundle\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 
15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.013667 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb3975-ee1c-4d33-b619-d0f898300c93-logs\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.013688 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-config-data\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.015478 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69df65c7dc-vl6ck"] Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.022864 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.026922 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69df65c7dc-vl6ck"] Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117092 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-sb\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117192 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfk2t\" (UniqueName: \"kubernetes.io/projected/73c9c3a0-87c7-4f79-b0a7-3964416ea053-kube-api-access-jfk2t\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117250 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb3975-ee1c-4d33-b619-d0f898300c93-logs\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117276 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-config-data\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117336 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-nb\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117377 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-dns-svc\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: 
\"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117403 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtb4w\" (UniqueName: \"kubernetes.io/projected/b4bb3975-ee1c-4d33-b619-d0f898300c93-kube-api-access-qtb4w\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117431 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-scripts\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117597 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-combined-ca-bundle\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117698 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-config\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.117698 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb3975-ee1c-4d33-b619-d0f898300c93-logs\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.125123 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-config-data\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.130999 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-scripts\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.131888 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-combined-ca-bundle\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.134390 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtb4w\" (UniqueName: \"kubernetes.io/projected/b4bb3975-ee1c-4d33-b619-d0f898300c93-kube-api-access-qtb4w\") pod \"placement-db-sync-cpnns\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") " pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.218968 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-config\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.219106 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-sb\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.219161 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfk2t\" (UniqueName: \"kubernetes.io/projected/73c9c3a0-87c7-4f79-b0a7-3964416ea053-kube-api-access-jfk2t\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.219246 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-nb\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.219289 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-dns-svc\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.220043 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-config\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.220862 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-nb\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.220992 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-sb\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.221015 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-dns-svc\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.240688 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfk2t\" (UniqueName: 
\"kubernetes.io/projected/73c9c3a0-87c7-4f79-b0a7-3964416ea053-kube-api-access-jfk2t\") pod \"dnsmasq-dns-69df65c7dc-vl6ck\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.314834 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cpnns" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.342744 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.743412 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cpnns"] Jan 29 15:32:56 crc kubenswrapper[4753]: W0129 15:32:56.746547 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4bb3975_ee1c_4d33_b619_d0f898300c93.slice/crio-ffcdfa1d0509481ed5a7a40540b8bba03021f890bfa7397d18d761301ea1fc20 WatchSource:0}: Error finding container ffcdfa1d0509481ed5a7a40540b8bba03021f890bfa7397d18d761301ea1fc20: Status 404 returned error can't find the container with id ffcdfa1d0509481ed5a7a40540b8bba03021f890bfa7397d18d761301ea1fc20 Jan 29 15:32:56 crc kubenswrapper[4753]: I0129 15:32:56.842410 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69df65c7dc-vl6ck"] Jan 29 15:32:57 crc kubenswrapper[4753]: I0129 15:32:57.641529 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cpnns" event={"ID":"b4bb3975-ee1c-4d33-b619-d0f898300c93","Type":"ContainerStarted","Data":"78e54756122595ece8fadea5a2a1241966847d69027f124e6fcdde3ed271d95c"} Jan 29 15:32:57 crc kubenswrapper[4753]: I0129 15:32:57.642013 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cpnns" event={"ID":"b4bb3975-ee1c-4d33-b619-d0f898300c93","Type":"ContainerStarted","Data":"ffcdfa1d0509481ed5a7a40540b8bba03021f890bfa7397d18d761301ea1fc20"} Jan 29 15:32:57 crc kubenswrapper[4753]: I0129 15:32:57.644607 4753 generic.go:334] "Generic (PLEG): container finished" podID="73c9c3a0-87c7-4f79-b0a7-3964416ea053" containerID="8ec22344cf77cc426962964ec25f6888fe513c8ec20ee46c886ef1f8cf055a83" exitCode=0 Jan 29 15:32:57 crc kubenswrapper[4753]: I0129 15:32:57.644669 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" event={"ID":"73c9c3a0-87c7-4f79-b0a7-3964416ea053","Type":"ContainerDied","Data":"8ec22344cf77cc426962964ec25f6888fe513c8ec20ee46c886ef1f8cf055a83"} Jan 29 15:32:57 crc kubenswrapper[4753]: I0129 15:32:57.644701 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" event={"ID":"73c9c3a0-87c7-4f79-b0a7-3964416ea053","Type":"ContainerStarted","Data":"d800ae7096ed727b106b47773d98b332fc1b9e68663a2b7006a829e4c653d061"} Jan 29 15:32:57 crc kubenswrapper[4753]: I0129 15:32:57.669281 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-cpnns" podStartSLOduration=2.669264536 podStartE2EDuration="2.669264536s" podCreationTimestamp="2026-01-29 15:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:57.665916035 +0000 UTC m=+5412.360650417" watchObservedRunningTime="2026-01-29 15:32:57.669264536 +0000 UTC m=+5412.363998918" Jan 29 15:32:58 crc 
kubenswrapper[4753]: I0129 15:32:58.653524 4753 generic.go:334] "Generic (PLEG): container finished" podID="b4bb3975-ee1c-4d33-b619-d0f898300c93" containerID="78e54756122595ece8fadea5a2a1241966847d69027f124e6fcdde3ed271d95c" exitCode=0
Jan 29 15:32:58 crc kubenswrapper[4753]: I0129 15:32:58.653575 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cpnns" event={"ID":"b4bb3975-ee1c-4d33-b619-d0f898300c93","Type":"ContainerDied","Data":"78e54756122595ece8fadea5a2a1241966847d69027f124e6fcdde3ed271d95c"}
Jan 29 15:32:58 crc kubenswrapper[4753]: I0129 15:32:58.656777 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" event={"ID":"73c9c3a0-87c7-4f79-b0a7-3964416ea053","Type":"ContainerStarted","Data":"5d38142ecaab2439dc4320a1292f6aa03921c4d9d55f4c65cf8ba2929c50bcf3"}
Jan 29 15:32:58 crc kubenswrapper[4753]: I0129 15:32:58.657318 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck"
Jan 29 15:32:58 crc kubenswrapper[4753]: I0129 15:32:58.694100 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" podStartSLOduration=3.694082243 podStartE2EDuration="3.694082243s" podCreationTimestamp="2026-01-29 15:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:32:58.689401047 +0000 UTC m=+5413.384135449" watchObservedRunningTime="2026-01-29 15:32:58.694082243 +0000 UTC m=+5413.388816625"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.101800 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cpnns"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.209381 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-scripts\") pod \"b4bb3975-ee1c-4d33-b619-d0f898300c93\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") "
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.209510 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-config-data\") pod \"b4bb3975-ee1c-4d33-b619-d0f898300c93\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") "
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.209604 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-combined-ca-bundle\") pod \"b4bb3975-ee1c-4d33-b619-d0f898300c93\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") "
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.209661 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtb4w\" (UniqueName: \"kubernetes.io/projected/b4bb3975-ee1c-4d33-b619-d0f898300c93-kube-api-access-qtb4w\") pod \"b4bb3975-ee1c-4d33-b619-d0f898300c93\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") "
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.209765 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb3975-ee1c-4d33-b619-d0f898300c93-logs\") pod \"b4bb3975-ee1c-4d33-b619-d0f898300c93\" (UID: \"b4bb3975-ee1c-4d33-b619-d0f898300c93\") "
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.210365 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4bb3975-ee1c-4d33-b619-d0f898300c93-logs" (OuterVolumeSpecName: "logs") pod "b4bb3975-ee1c-4d33-b619-d0f898300c93" (UID: "b4bb3975-ee1c-4d33-b619-d0f898300c93"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.211496 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4bb3975-ee1c-4d33-b619-d0f898300c93-logs\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.214821 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4bb3975-ee1c-4d33-b619-d0f898300c93-kube-api-access-qtb4w" (OuterVolumeSpecName: "kube-api-access-qtb4w") pod "b4bb3975-ee1c-4d33-b619-d0f898300c93" (UID: "b4bb3975-ee1c-4d33-b619-d0f898300c93"). InnerVolumeSpecName "kube-api-access-qtb4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.215491 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-scripts" (OuterVolumeSpecName: "scripts") pod "b4bb3975-ee1c-4d33-b619-d0f898300c93" (UID: "b4bb3975-ee1c-4d33-b619-d0f898300c93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.248308 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4bb3975-ee1c-4d33-b619-d0f898300c93" (UID: "b4bb3975-ee1c-4d33-b619-d0f898300c93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.249838 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-config-data" (OuterVolumeSpecName: "config-data") pod "b4bb3975-ee1c-4d33-b619-d0f898300c93" (UID: "b4bb3975-ee1c-4d33-b619-d0f898300c93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.313321 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.313355 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtb4w\" (UniqueName: \"kubernetes.io/projected/b4bb3975-ee1c-4d33-b619-d0f898300c93-kube-api-access-qtb4w\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.313368 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.313376 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4bb3975-ee1c-4d33-b619-d0f898300c93-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.681089 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cpnns" event={"ID":"b4bb3975-ee1c-4d33-b619-d0f898300c93","Type":"ContainerDied","Data":"ffcdfa1d0509481ed5a7a40540b8bba03021f890bfa7397d18d761301ea1fc20"}
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.681427 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffcdfa1d0509481ed5a7a40540b8bba03021f890bfa7397d18d761301ea1fc20"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.681347 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cpnns"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.752482 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7649df9fd-nqw7s"]
Jan 29 15:33:00 crc kubenswrapper[4753]: E0129 15:33:00.752831 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bb3975-ee1c-4d33-b619-d0f898300c93" containerName="placement-db-sync"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.752848 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bb3975-ee1c-4d33-b619-d0f898300c93" containerName="placement-db-sync"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.752994 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bb3975-ee1c-4d33-b619-d0f898300c93" containerName="placement-db-sync"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.753863 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.756041 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.756248 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f2t7w"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.757814 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.765350 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7649df9fd-nqw7s"]
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.822446 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f9e94f3-540e-4cc6-a623-59b52104d6c8-config-data\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.822496 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9e94f3-540e-4cc6-a623-59b52104d6c8-combined-ca-bundle\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.822530 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkdkh\" (UniqueName: \"kubernetes.io/projected/0f9e94f3-540e-4cc6-a623-59b52104d6c8-kube-api-access-nkdkh\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.822693 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f9e94f3-540e-4cc6-a623-59b52104d6c8-logs\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.822917 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f9e94f3-540e-4cc6-a623-59b52104d6c8-scripts\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.924833 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f9e94f3-540e-4cc6-a623-59b52104d6c8-logs\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.924933 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f9e94f3-540e-4cc6-a623-59b52104d6c8-scripts\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.925001 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f9e94f3-540e-4cc6-a623-59b52104d6c8-config-data\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.925022 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9e94f3-540e-4cc6-a623-59b52104d6c8-combined-ca-bundle\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.925055 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkdkh\" (UniqueName: \"kubernetes.io/projected/0f9e94f3-540e-4cc6-a623-59b52104d6c8-kube-api-access-nkdkh\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.925968 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f9e94f3-540e-4cc6-a623-59b52104d6c8-logs\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.930691 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9e94f3-540e-4cc6-a623-59b52104d6c8-combined-ca-bundle\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.930853 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f9e94f3-540e-4cc6-a623-59b52104d6c8-config-data\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.932234 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f9e94f3-540e-4cc6-a623-59b52104d6c8-scripts\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:00 crc kubenswrapper[4753]: I0129 15:33:00.942728 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkdkh\" (UniqueName: \"kubernetes.io/projected/0f9e94f3-540e-4cc6-a623-59b52104d6c8-kube-api-access-nkdkh\") pod \"placement-7649df9fd-nqw7s\" (UID: \"0f9e94f3-540e-4cc6-a623-59b52104d6c8\") " pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:01 crc kubenswrapper[4753]: I0129 15:33:01.072013 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:01 crc kubenswrapper[4753]: I0129 15:33:01.548439 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7649df9fd-nqw7s"]
Jan 29 15:33:01 crc kubenswrapper[4753]: W0129 15:33:01.550798 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f9e94f3_540e_4cc6_a623_59b52104d6c8.slice/crio-b238af6777730e23567217dc29d82b43cba1a834d976ce06d795ce8809c72215 WatchSource:0}: Error finding container b238af6777730e23567217dc29d82b43cba1a834d976ce06d795ce8809c72215: Status 404 returned error can't find the container with id b238af6777730e23567217dc29d82b43cba1a834d976ce06d795ce8809c72215
Jan 29 15:33:01 crc kubenswrapper[4753]: I0129 15:33:01.691404 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7649df9fd-nqw7s" event={"ID":"0f9e94f3-540e-4cc6-a623-59b52104d6c8","Type":"ContainerStarted","Data":"b238af6777730e23567217dc29d82b43cba1a834d976ce06d795ce8809c72215"}
Jan 29 15:33:02 crc kubenswrapper[4753]: I0129 15:33:02.703384 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7649df9fd-nqw7s" event={"ID":"0f9e94f3-540e-4cc6-a623-59b52104d6c8","Type":"ContainerStarted","Data":"15557b50bfe2f974970038074d53fa64f1782da1595293ad3173b16a3591154f"}
Jan 29 15:33:02 crc kubenswrapper[4753]: I0129 15:33:02.704703 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7649df9fd-nqw7s" event={"ID":"0f9e94f3-540e-4cc6-a623-59b52104d6c8","Type":"ContainerStarted","Data":"d5a3810d25395661c7840698270be37ada294d464cff2216e0b8d754840cc9f8"}
Jan 29 15:33:02 crc kubenswrapper[4753]: I0129 15:33:02.704855 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:02 crc kubenswrapper[4753]: I0129 15:33:02.704882 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:02 crc kubenswrapper[4753]: I0129 15:33:02.730323 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7649df9fd-nqw7s" podStartSLOduration=2.730300075 podStartE2EDuration="2.730300075s" podCreationTimestamp="2026-01-29 15:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:33:02.718495677 +0000 UTC m=+5417.413230079" watchObservedRunningTime="2026-01-29 15:33:02.730300075 +0000 UTC m=+5417.425034467"
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.344833 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck"
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.416440 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b76cd765c-m6k2s"]
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.416701 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" podUID="bb56f0c2-00da-4cc4-abe7-5f1ee656190b" containerName="dnsmasq-dns" containerID="cri-o://4eac7d072e82edbbd9f29931d6540e3f2eaad20486cb98fe983cf9c4b0cd47dd" gracePeriod=10
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.739119 4753 generic.go:334] "Generic (PLEG): container finished" podID="bb56f0c2-00da-4cc4-abe7-5f1ee656190b" containerID="4eac7d072e82edbbd9f29931d6540e3f2eaad20486cb98fe983cf9c4b0cd47dd" exitCode=0
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.739192 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" event={"ID":"bb56f0c2-00da-4cc4-abe7-5f1ee656190b","Type":"ContainerDied","Data":"4eac7d072e82edbbd9f29931d6540e3f2eaad20486cb98fe983cf9c4b0cd47dd"}
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.920902 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s"
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.947774 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-nb\") pod \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") "
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.947975 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-sb\") pod \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") "
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.948076 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl4xf\" (UniqueName: \"kubernetes.io/projected/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-kube-api-access-sl4xf\") pod \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") "
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.948248 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-config\") pod \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") "
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.948363 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-dns-svc\") pod \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\" (UID: \"bb56f0c2-00da-4cc4-abe7-5f1ee656190b\") "
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.970342 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-kube-api-access-sl4xf" (OuterVolumeSpecName: "kube-api-access-sl4xf") pod "bb56f0c2-00da-4cc4-abe7-5f1ee656190b" (UID: "bb56f0c2-00da-4cc4-abe7-5f1ee656190b"). InnerVolumeSpecName "kube-api-access-sl4xf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.997390 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb56f0c2-00da-4cc4-abe7-5f1ee656190b" (UID: "bb56f0c2-00da-4cc4-abe7-5f1ee656190b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.998096 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb56f0c2-00da-4cc4-abe7-5f1ee656190b" (UID: "bb56f0c2-00da-4cc4-abe7-5f1ee656190b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:33:06 crc kubenswrapper[4753]: I0129 15:33:06.999080 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb56f0c2-00da-4cc4-abe7-5f1ee656190b" (UID: "bb56f0c2-00da-4cc4-abe7-5f1ee656190b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.013522 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-config" (OuterVolumeSpecName: "config") pod "bb56f0c2-00da-4cc4-abe7-5f1ee656190b" (UID: "bb56f0c2-00da-4cc4-abe7-5f1ee656190b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.050390 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.050426 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.050440 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl4xf\" (UniqueName: \"kubernetes.io/projected/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-kube-api-access-sl4xf\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.050457 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-config\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.050468 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb56f0c2-00da-4cc4-abe7-5f1ee656190b-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.764011 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s" event={"ID":"bb56f0c2-00da-4cc4-abe7-5f1ee656190b","Type":"ContainerDied","Data":"9e7fd66f1781b9103ed24cb6b4e752d8f9db839759f26cfc4301f2cddc2db42c"}
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.764122 4753 scope.go:117] "RemoveContainer" containerID="4eac7d072e82edbbd9f29931d6540e3f2eaad20486cb98fe983cf9c4b0cd47dd"
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.764142 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b76cd765c-m6k2s"
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.788399 4753 scope.go:117] "RemoveContainer" containerID="69af38e1cbfd5738cde4283f5589ed9cd1922619257a1e05eb6e271e5c8c20d8"
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.813319 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b76cd765c-m6k2s"]
Jan 29 15:33:07 crc kubenswrapper[4753]: I0129 15:33:07.827015 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b76cd765c-m6k2s"]
Jan 29 15:33:08 crc kubenswrapper[4753]: I0129 15:33:08.163297 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb56f0c2-00da-4cc4-abe7-5f1ee656190b" path="/var/lib/kubelet/pods/bb56f0c2-00da-4cc4-abe7-5f1ee656190b/volumes"
Jan 29 15:33:32 crc kubenswrapper[4753]: I0129 15:33:32.136939 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:32 crc kubenswrapper[4753]: I0129 15:33:32.158679 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7649df9fd-nqw7s"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.719239 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-bsz4g"]
Jan 29 15:33:53 crc kubenswrapper[4753]: E0129 15:33:53.720288 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb56f0c2-00da-4cc4-abe7-5f1ee656190b" containerName="init"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.720307 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb56f0c2-00da-4cc4-abe7-5f1ee656190b" containerName="init"
Jan 29 15:33:53 crc kubenswrapper[4753]: E0129 15:33:53.720338 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb56f0c2-00da-4cc4-abe7-5f1ee656190b" containerName="dnsmasq-dns"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.720348 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb56f0c2-00da-4cc4-abe7-5f1ee656190b" containerName="dnsmasq-dns"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.720552 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb56f0c2-00da-4cc4-abe7-5f1ee656190b" containerName="dnsmasq-dns"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.721350 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bsz4g"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.735748 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bsz4g"]
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.771139 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh79k\" (UniqueName: \"kubernetes.io/projected/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-kube-api-access-mh79k\") pod \"nova-api-db-create-bsz4g\" (UID: \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\") " pod="openstack/nova-api-db-create-bsz4g"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.772270 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-operator-scripts\") pod \"nova-api-db-create-bsz4g\" (UID: \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\") " pod="openstack/nova-api-db-create-bsz4g"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.811290 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tmg7x"]
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.812491 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tmg7x"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.817749 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tmg7x"]
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.873719 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh79k\" (UniqueName: \"kubernetes.io/projected/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-kube-api-access-mh79k\") pod \"nova-api-db-create-bsz4g\" (UID: \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\") " pod="openstack/nova-api-db-create-bsz4g"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.873789 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg9wg\" (UniqueName: \"kubernetes.io/projected/2df4bfd0-cd78-4225-9eac-0903d4df186d-kube-api-access-zg9wg\") pod \"nova-cell0-db-create-tmg7x\" (UID: \"2df4bfd0-cd78-4225-9eac-0903d4df186d\") " pod="openstack/nova-cell0-db-create-tmg7x"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.873865 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df4bfd0-cd78-4225-9eac-0903d4df186d-operator-scripts\") pod \"nova-cell0-db-create-tmg7x\" (UID: \"2df4bfd0-cd78-4225-9eac-0903d4df186d\") " pod="openstack/nova-cell0-db-create-tmg7x"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.873944 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-operator-scripts\") pod \"nova-api-db-create-bsz4g\" (UID: \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\") " pod="openstack/nova-api-db-create-bsz4g"
Jan 29 15:33:53 crc kubenswrapper[4753]: I0129 15:33:53.874704 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-operator-scripts\") pod \"nova-api-db-create-bsz4g\" (UID: \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\") " pod="openstack/nova-api-db-create-bsz4g"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.595688 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg9wg\" (UniqueName: \"kubernetes.io/projected/2df4bfd0-cd78-4225-9eac-0903d4df186d-kube-api-access-zg9wg\") pod \"nova-cell0-db-create-tmg7x\" (UID: \"2df4bfd0-cd78-4225-9eac-0903d4df186d\") " pod="openstack/nova-cell0-db-create-tmg7x"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.597008 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df4bfd0-cd78-4225-9eac-0903d4df186d-operator-scripts\") pod \"nova-cell0-db-create-tmg7x\" (UID: \"2df4bfd0-cd78-4225-9eac-0903d4df186d\") " pod="openstack/nova-cell0-db-create-tmg7x"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.598374 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df4bfd0-cd78-4225-9eac-0903d4df186d-operator-scripts\") pod \"nova-cell0-db-create-tmg7x\" (UID: \"2df4bfd0-cd78-4225-9eac-0903d4df186d\") " pod="openstack/nova-cell0-db-create-tmg7x"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.602113 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh79k\" (UniqueName: \"kubernetes.io/projected/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-kube-api-access-mh79k\") pod \"nova-api-db-create-bsz4g\" (UID: \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\") " pod="openstack/nova-api-db-create-bsz4g"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.624769 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg9wg\" (UniqueName: \"kubernetes.io/projected/2df4bfd0-cd78-4225-9eac-0903d4df186d-kube-api-access-zg9wg\") pod \"nova-cell0-db-create-tmg7x\" (UID: \"2df4bfd0-cd78-4225-9eac-0903d4df186d\") " pod="openstack/nova-cell0-db-create-tmg7x"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.637620 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5kggc"]
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.639088 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5kggc"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.642233 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bsz4g"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.659790 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-93e2-account-create-update-7dxp5"]
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.661208 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-93e2-account-create-update-7dxp5"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.663903 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.670702 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5kggc"]
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.678984 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-93e2-account-create-update-7dxp5"]
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.698014 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3f5b7b-6054-4179-aefb-8ac06bf44628-operator-scripts\") pod \"nova-api-93e2-account-create-update-7dxp5\" (UID: \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\") " pod="openstack/nova-api-93e2-account-create-update-7dxp5"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.698103 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zp4\" (UniqueName: \"kubernetes.io/projected/33591b63-9910-4e2b-8188-c8598c1b510f-kube-api-access-n5zp4\") pod \"nova-cell1-db-create-5kggc\" (UID: \"33591b63-9910-4e2b-8188-c8598c1b510f\") " pod="openstack/nova-cell1-db-create-5kggc"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.698196 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2sbv\" (UniqueName: \"kubernetes.io/projected/6f3f5b7b-6054-4179-aefb-8ac06bf44628-kube-api-access-q2sbv\") pod \"nova-api-93e2-account-create-update-7dxp5\" (UID: \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\") " pod="openstack/nova-api-93e2-account-create-update-7dxp5"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.698262 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33591b63-9910-4e2b-8188-c8598c1b510f-operator-scripts\") pod \"nova-cell1-db-create-5kggc\" (UID: \"33591b63-9910-4e2b-8188-c8598c1b510f\") " pod="openstack/nova-cell1-db-create-5kggc"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.707317 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4efd-account-create-update-9rt7v"]
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.708594 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4efd-account-create-update-9rt7v"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.711102 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.733232 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4efd-account-create-update-9rt7v"]
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.735602 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tmg7x"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.750012 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-57e6-account-create-update-zm6k8"]
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.751299 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-57e6-account-create-update-zm6k8"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.753952 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.761443 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-57e6-account-create-update-zm6k8"]
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.803545 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3f5b7b-6054-4179-aefb-8ac06bf44628-operator-scripts\") pod \"nova-api-93e2-account-create-update-7dxp5\" (UID: \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\") " pod="openstack/nova-api-93e2-account-create-update-7dxp5"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.803919 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zp4\" (UniqueName: \"kubernetes.io/projected/33591b63-9910-4e2b-8188-c8598c1b510f-kube-api-access-n5zp4\") pod \"nova-cell1-db-create-5kggc\" (UID: \"33591b63-9910-4e2b-8188-c8598c1b510f\") " pod="openstack/nova-cell1-db-create-5kggc"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.804003 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2sbv\" (UniqueName: \"kubernetes.io/projected/6f3f5b7b-6054-4179-aefb-8ac06bf44628-kube-api-access-q2sbv\") pod \"nova-api-93e2-account-create-update-7dxp5\" (UID: \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\") " pod="openstack/nova-api-93e2-account-create-update-7dxp5"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.804115 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33591b63-9910-4e2b-8188-c8598c1b510f-operator-scripts\") pod \"nova-cell1-db-create-5kggc\" (UID: \"33591b63-9910-4e2b-8188-c8598c1b510f\") " pod="openstack/nova-cell1-db-create-5kggc"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.804837 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33591b63-9910-4e2b-8188-c8598c1b510f-operator-scripts\") pod \"nova-cell1-db-create-5kggc\" (UID: \"33591b63-9910-4e2b-8188-c8598c1b510f\") " pod="openstack/nova-cell1-db-create-5kggc"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.805108 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3f5b7b-6054-4179-aefb-8ac06bf44628-operator-scripts\") pod \"nova-api-93e2-account-create-update-7dxp5\" (UID: \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\") " pod="openstack/nova-api-93e2-account-create-update-7dxp5"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.828509 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2sbv\" (UniqueName: \"kubernetes.io/projected/6f3f5b7b-6054-4179-aefb-8ac06bf44628-kube-api-access-q2sbv\") pod \"nova-api-93e2-account-create-update-7dxp5\" (UID: \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\") " pod="openstack/nova-api-93e2-account-create-update-7dxp5"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.836335 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zp4\" (UniqueName: \"kubernetes.io/projected/33591b63-9910-4e2b-8188-c8598c1b510f-kube-api-access-n5zp4\") pod \"nova-cell1-db-create-5kggc\" (UID: \"33591b63-9910-4e2b-8188-c8598c1b510f\") " pod="openstack/nova-cell1-db-create-5kggc"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.909168 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26381a9-132d-429c-bbaf-396b609c273c-operator-scripts\") pod \"nova-cell0-4efd-account-create-update-9rt7v\" (UID: \"b26381a9-132d-429c-bbaf-396b609c273c\") " pod="openstack/nova-cell0-4efd-account-create-update-9rt7v"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.909411 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d53a245-058d-428a-9de6-4d65eff12330-operator-scripts\") pod \"nova-cell1-57e6-account-create-update-zm6k8\" (UID: \"1d53a245-058d-428a-9de6-4d65eff12330\") " pod="openstack/nova-cell1-57e6-account-create-update-zm6k8"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.909475 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v6rd\" (UniqueName: \"kubernetes.io/projected/b26381a9-132d-429c-bbaf-396b609c273c-kube-api-access-2v6rd\") pod \"nova-cell0-4efd-account-create-update-9rt7v\" (UID: \"b26381a9-132d-429c-bbaf-396b609c273c\") " pod="openstack/nova-cell0-4efd-account-create-update-9rt7v"
Jan 29 15:33:54 crc kubenswrapper[4753]: I0129 15:33:54.909931 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7zg4\" (UniqueName: \"kubernetes.io/projected/1d53a245-058d-428a-9de6-4d65eff12330-kube-api-access-x7zg4\") pod \"nova-cell1-57e6-account-create-update-zm6k8\" (UID: \"1d53a245-058d-428a-9de6-4d65eff12330\") " pod="openstack/nova-cell1-57e6-account-create-update-zm6k8"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.014818 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d53a245-058d-428a-9de6-4d65eff12330-operator-scripts\") pod \"nova-cell1-57e6-account-create-update-zm6k8\" (UID: \"1d53a245-058d-428a-9de6-4d65eff12330\") " pod="openstack/nova-cell1-57e6-account-create-update-zm6k8"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.014886 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v6rd\" (UniqueName: \"kubernetes.io/projected/b26381a9-132d-429c-bbaf-396b609c273c-kube-api-access-2v6rd\") pod \"nova-cell0-4efd-account-create-update-9rt7v\" (UID: \"b26381a9-132d-429c-bbaf-396b609c273c\") " pod="openstack/nova-cell0-4efd-account-create-update-9rt7v"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.015016 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7zg4\" (UniqueName: \"kubernetes.io/projected/1d53a245-058d-428a-9de6-4d65eff12330-kube-api-access-x7zg4\") pod \"nova-cell1-57e6-account-create-update-zm6k8\" (UID: \"1d53a245-058d-428a-9de6-4d65eff12330\") " pod="openstack/nova-cell1-57e6-account-create-update-zm6k8"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.015073 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26381a9-132d-429c-bbaf-396b609c273c-operator-scripts\") pod \"nova-cell0-4efd-account-create-update-9rt7v\" (UID: \"b26381a9-132d-429c-bbaf-396b609c273c\") " pod="openstack/nova-cell0-4efd-account-create-update-9rt7v"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.015694 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d53a245-058d-428a-9de6-4d65eff12330-operator-scripts\") pod \"nova-cell1-57e6-account-create-update-zm6k8\" (UID: \"1d53a245-058d-428a-9de6-4d65eff12330\") " pod="openstack/nova-cell1-57e6-account-create-update-zm6k8"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.016146 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26381a9-132d-429c-bbaf-396b609c273c-operator-scripts\") pod \"nova-cell0-4efd-account-create-update-9rt7v\" (UID: \"b26381a9-132d-429c-bbaf-396b609c273c\") " pod="openstack/nova-cell0-4efd-account-create-update-9rt7v"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.034169 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7zg4\" (UniqueName: \"kubernetes.io/projected/1d53a245-058d-428a-9de6-4d65eff12330-kube-api-access-x7zg4\") pod \"nova-cell1-57e6-account-create-update-zm6k8\" (UID: \"1d53a245-058d-428a-9de6-4d65eff12330\") " pod="openstack/nova-cell1-57e6-account-create-update-zm6k8"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.035459 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v6rd\" (UniqueName: \"kubernetes.io/projected/b26381a9-132d-429c-bbaf-396b609c273c-kube-api-access-2v6rd\") pod \"nova-cell0-4efd-account-create-update-9rt7v\" (UID: \"b26381a9-132d-429c-bbaf-396b609c273c\") " pod="openstack/nova-cell0-4efd-account-create-update-9rt7v"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.062690 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5kggc"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.104839 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-93e2-account-create-update-7dxp5"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.120989 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4efd-account-create-update-9rt7v"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.154471 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-57e6-account-create-update-zm6k8"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.205832 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bsz4g"]
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.268479 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tmg7x"]
Jan 29 15:33:55 crc kubenswrapper[4753]: W0129 15:33:55.281536 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2df4bfd0_cd78_4225_9eac_0903d4df186d.slice/crio-3802356226a4ff005a61a8ef73275f80e4f09d82f940fffc51a928a1ea4619f8 WatchSource:0}: Error finding container 3802356226a4ff005a61a8ef73275f80e4f09d82f940fffc51a928a1ea4619f8: Status 404 returned error can't find the container with id 3802356226a4ff005a61a8ef73275f80e4f09d82f940fffc51a928a1ea4619f8
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.562810 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5kggc"]
Jan 29 15:33:55 crc kubenswrapper[4753]: W0129 15:33:55.564329 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33591b63_9910_4e2b_8188_c8598c1b510f.slice/crio-c1082982f1748bd3f1233601c0c2e88766cac7a5e01f873b5a2e437c9149052f WatchSource:0}: Error finding container c1082982f1748bd3f1233601c0c2e88766cac7a5e01f873b5a2e437c9149052f: Status 404 returned error can't find the container with id c1082982f1748bd3f1233601c0c2e88766cac7a5e01f873b5a2e437c9149052f
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.617859 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5kggc" event={"ID":"33591b63-9910-4e2b-8188-c8598c1b510f","Type":"ContainerStarted","Data":"c1082982f1748bd3f1233601c0c2e88766cac7a5e01f873b5a2e437c9149052f"}
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.619557 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bsz4g" event={"ID":"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7","Type":"ContainerStarted","Data":"0a6be775f2cd5e5d07c78d6c98ef48e7324779a0346174264aa991bdcbeccd33"}
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.619596 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bsz4g" event={"ID":"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7","Type":"ContainerStarted","Data":"883050ed18d3b4d74a011f205923a7ef3773657459e2e50b19729c93337609f5"}
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.622591 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tmg7x" event={"ID":"2df4bfd0-cd78-4225-9eac-0903d4df186d","Type":"ContainerStarted","Data":"db01f621e3933d8854d001cfb96afddc78b096bf6c50af9bf725b27ada74f3bc"}
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.622630 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tmg7x" event={"ID":"2df4bfd0-cd78-4225-9eac-0903d4df186d","Type":"ContainerStarted","Data":"3802356226a4ff005a61a8ef73275f80e4f09d82f940fffc51a928a1ea4619f8"}
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.642492 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-bsz4g" podStartSLOduration=2.642470706 podStartE2EDuration="2.642470706s" podCreationTimestamp="2026-01-29 15:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:33:55.632415915 +0000 UTC m=+5470.327150317" watchObservedRunningTime="2026-01-29 15:33:55.642470706 +0000 UTC m=+5470.337205088"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.658092 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-tmg7x" podStartSLOduration=2.658074207 podStartE2EDuration="2.658074207s" podCreationTimestamp="2026-01-29 15:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:33:55.654717596 +0000 UTC m=+5470.349451978" watchObservedRunningTime="2026-01-29 15:33:55.658074207 +0000 UTC m=+5470.352808589"
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.673260 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-93e2-account-create-update-7dxp5"]
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.690488 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4efd-account-create-update-9rt7v"]
Jan 29 15:33:55 crc kubenswrapper[4753]: I0129 15:33:55.733395 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-57e6-account-create-update-zm6k8"]
Jan 29 15:33:55 crc kubenswrapper[4753]: W0129 15:33:55.733962 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d53a245_058d_428a_9de6_4d65eff12330.slice/crio-0758d302336f402a81a36a88b6fe4601d1dd972b36d1fe130dd5b9d9075f2227 WatchSource:0}: Error finding container 0758d302336f402a81a36a88b6fe4601d1dd972b36d1fe130dd5b9d9075f2227: Status 404 returned error can't find the container with id 0758d302336f402a81a36a88b6fe4601d1dd972b36d1fe130dd5b9d9075f2227
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.304797 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sf6s8"]
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.307413 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.319943 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sf6s8"]
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.439599 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-utilities\") pod \"certified-operators-sf6s8\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.439724 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-catalog-content\") pod \"certified-operators-sf6s8\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.439755 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdpts\" (UniqueName: \"kubernetes.io/projected/1880c639-6cce-4b11-9806-9ec0ea6309b3-kube-api-access-kdpts\") pod \"certified-operators-sf6s8\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.540858 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-catalog-content\") pod \"certified-operators-sf6s8\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.540915 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdpts\" (UniqueName: \"kubernetes.io/projected/1880c639-6cce-4b11-9806-9ec0ea6309b3-kube-api-access-kdpts\") pod \"certified-operators-sf6s8\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.540973 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-utilities\") pod \"certified-operators-sf6s8\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.541457 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-utilities\") pod \"certified-operators-sf6s8\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.541617 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-catalog-content\") pod \"certified-operators-sf6s8\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.560336 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdpts\" (UniqueName: \"kubernetes.io/projected/1880c639-6cce-4b11-9806-9ec0ea6309b3-kube-api-access-kdpts\") pod \"certified-operators-sf6s8\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.631254 4753 generic.go:334] "Generic (PLEG): container finished" podID="6f3f5b7b-6054-4179-aefb-8ac06bf44628" containerID="f9851b510b51f6cedeedb2272ea38c449045648ef008760f4bd1c7523f76faff" exitCode=0
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.631318 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-93e2-account-create-update-7dxp5" event={"ID":"6f3f5b7b-6054-4179-aefb-8ac06bf44628","Type":"ContainerDied","Data":"f9851b510b51f6cedeedb2272ea38c449045648ef008760f4bd1c7523f76faff"}
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.631343 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-93e2-account-create-update-7dxp5" event={"ID":"6f3f5b7b-6054-4179-aefb-8ac06bf44628","Type":"ContainerStarted","Data":"75bab7472ed2f436622f32e09fb16bf12eb13bdb41c62c83cb32e5c6c5515bf5"}
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.632956 4753 generic.go:334] "Generic (PLEG): container finished" podID="33591b63-9910-4e2b-8188-c8598c1b510f" containerID="12194361fbbdf0e221be0d1d2fa832b7a7be746789bcf75f3cd26d41d57dfa13" exitCode=0
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.633006 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5kggc" event={"ID":"33591b63-9910-4e2b-8188-c8598c1b510f","Type":"ContainerDied","Data":"12194361fbbdf0e221be0d1d2fa832b7a7be746789bcf75f3cd26d41d57dfa13"}
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.634589 4753 generic.go:334] "Generic (PLEG): container finished" podID="1d53a245-058d-428a-9de6-4d65eff12330" containerID="22faef2d8e3c78953fdd5fe9ae0808301b52e36fbb26df710e1c6fa0216fd2c2" exitCode=0
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.634637 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-57e6-account-create-update-zm6k8" event={"ID":"1d53a245-058d-428a-9de6-4d65eff12330","Type":"ContainerDied","Data":"22faef2d8e3c78953fdd5fe9ae0808301b52e36fbb26df710e1c6fa0216fd2c2"}
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.634659 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-57e6-account-create-update-zm6k8" event={"ID":"1d53a245-058d-428a-9de6-4d65eff12330","Type":"ContainerStarted","Data":"0758d302336f402a81a36a88b6fe4601d1dd972b36d1fe130dd5b9d9075f2227"}
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.636160 4753 generic.go:334] "Generic (PLEG): container finished" podID="9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7" containerID="0a6be775f2cd5e5d07c78d6c98ef48e7324779a0346174264aa991bdcbeccd33" exitCode=0
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.636229 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bsz4g" event={"ID":"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7","Type":"ContainerDied","Data":"0a6be775f2cd5e5d07c78d6c98ef48e7324779a0346174264aa991bdcbeccd33"}
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.638048 4753 generic.go:334] "Generic (PLEG): container finished" podID="b26381a9-132d-429c-bbaf-396b609c273c" containerID="13e68e8a4ccd774ac3de8880e8336b0a1d0e28bb31d378e3ad175d659df35e8e" exitCode=0
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.638145 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4efd-account-create-update-9rt7v" event={"ID":"b26381a9-132d-429c-bbaf-396b609c273c","Type":"ContainerDied","Data":"13e68e8a4ccd774ac3de8880e8336b0a1d0e28bb31d378e3ad175d659df35e8e"}
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.638214 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4efd-account-create-update-9rt7v" event={"ID":"b26381a9-132d-429c-bbaf-396b609c273c","Type":"ContainerStarted","Data":"3e4c9ec0be8370aace37e210d0185acacd5897a4ea0be17f07599f32174f393a"}
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.640407 4753 generic.go:334] "Generic (PLEG): container finished" podID="2df4bfd0-cd78-4225-9eac-0903d4df186d" containerID="db01f621e3933d8854d001cfb96afddc78b096bf6c50af9bf725b27ada74f3bc" exitCode=0
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.640445 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tmg7x" event={"ID":"2df4bfd0-cd78-4225-9eac-0903d4df186d","Type":"ContainerDied","Data":"db01f621e3933d8854d001cfb96afddc78b096bf6c50af9bf725b27ada74f3bc"}
Jan 29 15:33:56 crc kubenswrapper[4753]: I0129 15:33:56.804664 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sf6s8"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.055178 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.055233 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.110042 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zfwqr"]
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.120458 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.204086 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zfwqr"]
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.255342 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-catalog-content\") pod \"community-operators-zfwqr\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.255473 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7m6\" (UniqueName: \"kubernetes.io/projected/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-kube-api-access-rb7m6\") pod \"community-operators-zfwqr\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.255517 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-utilities\") pod \"community-operators-zfwqr\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.356528 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-utilities\") pod \"community-operators-zfwqr\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.356598 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-catalog-content\") pod \"community-operators-zfwqr\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.356673 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7m6\" (UniqueName: \"kubernetes.io/projected/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-kube-api-access-rb7m6\") pod \"community-operators-zfwqr\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.357327 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-utilities\") pod \"community-operators-zfwqr\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.357530 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-catalog-content\") pod \"community-operators-zfwqr\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.375108 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7m6\" (UniqueName: \"kubernetes.io/projected/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-kube-api-access-rb7m6\") pod \"community-operators-zfwqr\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:57 crc kubenswrapper[4753]: I0129 15:33:57.527271 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfwqr"
Jan 29 15:33:58 crc kubenswrapper[4753]: W0129 15:33:58.360791 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86bfcbb9_d197_423a_b9b4_2ebd2aedac13.slice/crio-1444304064e29bbcafc016eaf70ba18b7dbd6bedf3486ea9267010f594939b65 WatchSource:0}: Error finding container 1444304064e29bbcafc016eaf70ba18b7dbd6bedf3486ea9267010f594939b65: Status 404 returned error can't find the container with id 1444304064e29bbcafc016eaf70ba18b7dbd6bedf3486ea9267010f594939b65
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.362450 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zfwqr"]
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.413340 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4efd-account-create-update-9rt7v"
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.419444 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5kggc"
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.473310 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-57e6-account-create-update-zm6k8"
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.493933 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33591b63-9910-4e2b-8188-c8598c1b510f-operator-scripts\") pod \"33591b63-9910-4e2b-8188-c8598c1b510f\" (UID: \"33591b63-9910-4e2b-8188-c8598c1b510f\") "
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.494132 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26381a9-132d-429c-bbaf-396b609c273c-operator-scripts\") pod \"b26381a9-132d-429c-bbaf-396b609c273c\" (UID: \"b26381a9-132d-429c-bbaf-396b609c273c\") "
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.494287 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5zp4\" (UniqueName: \"kubernetes.io/projected/33591b63-9910-4e2b-8188-c8598c1b510f-kube-api-access-n5zp4\") pod \"33591b63-9910-4e2b-8188-c8598c1b510f\" (UID: \"33591b63-9910-4e2b-8188-c8598c1b510f\") "
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.494397 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v6rd\" (UniqueName: \"kubernetes.io/projected/b26381a9-132d-429c-bbaf-396b609c273c-kube-api-access-2v6rd\") pod \"b26381a9-132d-429c-bbaf-396b609c273c\" (UID: \"b26381a9-132d-429c-bbaf-396b609c273c\") "
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.499635 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26381a9-132d-429c-bbaf-396b609c273c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b26381a9-132d-429c-bbaf-396b609c273c" (UID: "b26381a9-132d-429c-bbaf-396b609c273c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.502641 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26381a9-132d-429c-bbaf-396b609c273c-kube-api-access-2v6rd" (OuterVolumeSpecName: "kube-api-access-2v6rd") pod "b26381a9-132d-429c-bbaf-396b609c273c" (UID: "b26381a9-132d-429c-bbaf-396b609c273c"). InnerVolumeSpecName "kube-api-access-2v6rd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.502740 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33591b63-9910-4e2b-8188-c8598c1b510f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33591b63-9910-4e2b-8188-c8598c1b510f" (UID: "33591b63-9910-4e2b-8188-c8598c1b510f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.504049 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33591b63-9910-4e2b-8188-c8598c1b510f-kube-api-access-n5zp4" (OuterVolumeSpecName: "kube-api-access-n5zp4") pod "33591b63-9910-4e2b-8188-c8598c1b510f" (UID: "33591b63-9910-4e2b-8188-c8598c1b510f"). InnerVolumeSpecName "kube-api-access-n5zp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.508426 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tmg7x"
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.514163 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sf6s8"]
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.526508 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-93e2-account-create-update-7dxp5"
Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.559201 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-bsz4g" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.597468 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d53a245-058d-428a-9de6-4d65eff12330-operator-scripts\") pod \"1d53a245-058d-428a-9de6-4d65eff12330\" (UID: \"1d53a245-058d-428a-9de6-4d65eff12330\") " Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.597578 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df4bfd0-cd78-4225-9eac-0903d4df186d-operator-scripts\") pod \"2df4bfd0-cd78-4225-9eac-0903d4df186d\" (UID: \"2df4bfd0-cd78-4225-9eac-0903d4df186d\") " Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.597631 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zg4\" (UniqueName: \"kubernetes.io/projected/1d53a245-058d-428a-9de6-4d65eff12330-kube-api-access-x7zg4\") pod \"1d53a245-058d-428a-9de6-4d65eff12330\" (UID: \"1d53a245-058d-428a-9de6-4d65eff12330\") " Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.597713 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg9wg\" (UniqueName: \"kubernetes.io/projected/2df4bfd0-cd78-4225-9eac-0903d4df186d-kube-api-access-zg9wg\") pod \"2df4bfd0-cd78-4225-9eac-0903d4df186d\" (UID: \"2df4bfd0-cd78-4225-9eac-0903d4df186d\") " Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.598020 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d53a245-058d-428a-9de6-4d65eff12330-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d53a245-058d-428a-9de6-4d65eff12330" (UID: "1d53a245-058d-428a-9de6-4d65eff12330"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.598100 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33591b63-9910-4e2b-8188-c8598c1b510f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.598117 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26381a9-132d-429c-bbaf-396b609c273c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.598126 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5zp4\" (UniqueName: \"kubernetes.io/projected/33591b63-9910-4e2b-8188-c8598c1b510f-kube-api-access-n5zp4\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.598137 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v6rd\" (UniqueName: \"kubernetes.io/projected/b26381a9-132d-429c-bbaf-396b609c273c-kube-api-access-2v6rd\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.598453 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df4bfd0-cd78-4225-9eac-0903d4df186d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2df4bfd0-cd78-4225-9eac-0903d4df186d" (UID: "2df4bfd0-cd78-4225-9eac-0903d4df186d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.601231 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df4bfd0-cd78-4225-9eac-0903d4df186d-kube-api-access-zg9wg" (OuterVolumeSpecName: "kube-api-access-zg9wg") pod "2df4bfd0-cd78-4225-9eac-0903d4df186d" (UID: "2df4bfd0-cd78-4225-9eac-0903d4df186d"). InnerVolumeSpecName "kube-api-access-zg9wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.602055 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d53a245-058d-428a-9de6-4d65eff12330-kube-api-access-x7zg4" (OuterVolumeSpecName: "kube-api-access-x7zg4") pod "1d53a245-058d-428a-9de6-4d65eff12330" (UID: "1d53a245-058d-428a-9de6-4d65eff12330"). InnerVolumeSpecName "kube-api-access-x7zg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.667959 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bsz4g" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.667939 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bsz4g" event={"ID":"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7","Type":"ContainerDied","Data":"883050ed18d3b4d74a011f205923a7ef3773657459e2e50b19729c93337609f5"} Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.668285 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="883050ed18d3b4d74a011f205923a7ef3773657459e2e50b19729c93337609f5" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.670598 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4efd-account-create-update-9rt7v" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.670594 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4efd-account-create-update-9rt7v" event={"ID":"b26381a9-132d-429c-bbaf-396b609c273c","Type":"ContainerDied","Data":"3e4c9ec0be8370aace37e210d0185acacd5897a4ea0be17f07599f32174f393a"} Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.670881 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e4c9ec0be8370aace37e210d0185acacd5897a4ea0be17f07599f32174f393a" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.672463 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tmg7x" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.672484 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tmg7x" event={"ID":"2df4bfd0-cd78-4225-9eac-0903d4df186d","Type":"ContainerDied","Data":"3802356226a4ff005a61a8ef73275f80e4f09d82f940fffc51a928a1ea4619f8"} Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.672517 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3802356226a4ff005a61a8ef73275f80e4f09d82f940fffc51a928a1ea4619f8" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.681977 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-93e2-account-create-update-7dxp5" event={"ID":"6f3f5b7b-6054-4179-aefb-8ac06bf44628","Type":"ContainerDied","Data":"75bab7472ed2f436622f32e09fb16bf12eb13bdb41c62c83cb32e5c6c5515bf5"} Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.682024 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75bab7472ed2f436622f32e09fb16bf12eb13bdb41c62c83cb32e5c6c5515bf5" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.682031 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-93e2-account-create-update-7dxp5" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.683657 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf6s8" event={"ID":"1880c639-6cce-4b11-9806-9ec0ea6309b3","Type":"ContainerStarted","Data":"f404f6da6e09fddab692342f9276b63e897d96295de855b903a32a890e965c7f"} Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.686301 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5kggc" event={"ID":"33591b63-9910-4e2b-8188-c8598c1b510f","Type":"ContainerDied","Data":"c1082982f1748bd3f1233601c0c2e88766cac7a5e01f873b5a2e437c9149052f"} Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.686324 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1082982f1748bd3f1233601c0c2e88766cac7a5e01f873b5a2e437c9149052f" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.686367 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5kggc" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.690067 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-57e6-account-create-update-zm6k8" event={"ID":"1d53a245-058d-428a-9de6-4d65eff12330","Type":"ContainerDied","Data":"0758d302336f402a81a36a88b6fe4601d1dd972b36d1fe130dd5b9d9075f2227"} Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.690124 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0758d302336f402a81a36a88b6fe4601d1dd972b36d1fe130dd5b9d9075f2227" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.690083 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-57e6-account-create-update-zm6k8" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.691978 4753 generic.go:334] "Generic (PLEG): container finished" podID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerID="d23d1513a2c4a55507098aac45a6900ee3387ccb1ac652deb6adbe784a61b189" exitCode=0 Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.692017 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfwqr" event={"ID":"86bfcbb9-d197-423a-b9b4-2ebd2aedac13","Type":"ContainerDied","Data":"d23d1513a2c4a55507098aac45a6900ee3387ccb1ac652deb6adbe784a61b189"} Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.692041 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfwqr" event={"ID":"86bfcbb9-d197-423a-b9b4-2ebd2aedac13","Type":"ContainerStarted","Data":"1444304064e29bbcafc016eaf70ba18b7dbd6bedf3486ea9267010f594939b65"} Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.700286 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.702381 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-operator-scripts\") pod \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\" (UID: \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\") " Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.702424 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2sbv\" (UniqueName: \"kubernetes.io/projected/6f3f5b7b-6054-4179-aefb-8ac06bf44628-kube-api-access-q2sbv\") pod \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\" (UID: \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\") " Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.702472 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh79k\" (UniqueName: \"kubernetes.io/projected/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-kube-api-access-mh79k\") pod \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\" (UID: \"9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7\") " Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.702509 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3f5b7b-6054-4179-aefb-8ac06bf44628-operator-scripts\") pod \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\" (UID: \"6f3f5b7b-6054-4179-aefb-8ac06bf44628\") " Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.702746 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7" (UID: "9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.703325 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg9wg\" (UniqueName: \"kubernetes.io/projected/2df4bfd0-cd78-4225-9eac-0903d4df186d-kube-api-access-zg9wg\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.703341 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d53a245-058d-428a-9de6-4d65eff12330-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.703350 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df4bfd0-cd78-4225-9eac-0903d4df186d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.703360 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zg4\" (UniqueName: \"kubernetes.io/projected/1d53a245-058d-428a-9de6-4d65eff12330-kube-api-access-x7zg4\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.703356 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3f5b7b-6054-4179-aefb-8ac06bf44628-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f3f5b7b-6054-4179-aefb-8ac06bf44628" (UID: "6f3f5b7b-6054-4179-aefb-8ac06bf44628"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.703368 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.705414 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-kube-api-access-mh79k" (OuterVolumeSpecName: "kube-api-access-mh79k") pod "9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7" (UID: "9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7"). InnerVolumeSpecName "kube-api-access-mh79k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.705568 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3f5b7b-6054-4179-aefb-8ac06bf44628-kube-api-access-q2sbv" (OuterVolumeSpecName: "kube-api-access-q2sbv") pod "6f3f5b7b-6054-4179-aefb-8ac06bf44628" (UID: "6f3f5b7b-6054-4179-aefb-8ac06bf44628"). InnerVolumeSpecName "kube-api-access-q2sbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.805642 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2sbv\" (UniqueName: \"kubernetes.io/projected/6f3f5b7b-6054-4179-aefb-8ac06bf44628-kube-api-access-q2sbv\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.805679 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh79k\" (UniqueName: \"kubernetes.io/projected/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7-kube-api-access-mh79k\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:58 crc kubenswrapper[4753]: I0129 15:33:58.805690 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f3f5b7b-6054-4179-aefb-8ac06bf44628-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.704047 4753 generic.go:334] "Generic (PLEG): container finished" podID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerID="6e5c1941db19f53b5bcb21055803069e1176920d9506bad5fab791f8ceb924a7" exitCode=0 Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.704786 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf6s8" event={"ID":"1880c639-6cce-4b11-9806-9ec0ea6309b3","Type":"ContainerDied","Data":"6e5c1941db19f53b5bcb21055803069e1176920d9506bad5fab791f8ceb924a7"} Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.709873 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfwqr" event={"ID":"86bfcbb9-d197-423a-b9b4-2ebd2aedac13","Type":"ContainerStarted","Data":"f84cbbe2b470b6a6dcbaa54c1a78b8c87f584c2a5668456a9b0e138900e220ce"} Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.716131 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jz77g"] Jan 29 15:33:59 crc kubenswrapper[4753]: E0129 15:33:59.716586 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3f5b7b-6054-4179-aefb-8ac06bf44628" containerName="mariadb-account-create-update" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.716610 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3f5b7b-6054-4179-aefb-8ac06bf44628" containerName="mariadb-account-create-update" Jan 29 15:33:59 crc kubenswrapper[4753]: E0129 15:33:59.716629 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7" containerName="mariadb-database-create" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.716638 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7" containerName="mariadb-database-create" Jan 29 15:33:59 crc kubenswrapper[4753]: E0129 15:33:59.716670 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d53a245-058d-428a-9de6-4d65eff12330" containerName="mariadb-account-create-update" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717317 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d53a245-058d-428a-9de6-4d65eff12330" containerName="mariadb-account-create-update" Jan 29 15:33:59 crc kubenswrapper[4753]: E0129 15:33:59.717332 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26381a9-132d-429c-bbaf-396b609c273c" containerName="mariadb-account-create-update" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717342 4753 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b26381a9-132d-429c-bbaf-396b609c273c" containerName="mariadb-account-create-update" Jan 29 15:33:59 crc kubenswrapper[4753]: E0129 15:33:59.717354 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df4bfd0-cd78-4225-9eac-0903d4df186d" containerName="mariadb-database-create" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717362 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df4bfd0-cd78-4225-9eac-0903d4df186d" containerName="mariadb-database-create" Jan 29 15:33:59 crc kubenswrapper[4753]: E0129 15:33:59.717387 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33591b63-9910-4e2b-8188-c8598c1b510f" containerName="mariadb-database-create" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717396 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="33591b63-9910-4e2b-8188-c8598c1b510f" containerName="mariadb-database-create" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717603 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26381a9-132d-429c-bbaf-396b609c273c" containerName="mariadb-account-create-update" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717632 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="33591b63-9910-4e2b-8188-c8598c1b510f" containerName="mariadb-database-create" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717649 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7" containerName="mariadb-database-create" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717666 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3f5b7b-6054-4179-aefb-8ac06bf44628" containerName="mariadb-account-create-update" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717683 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d53a245-058d-428a-9de6-4d65eff12330" containerName="mariadb-account-create-update" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.717699 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df4bfd0-cd78-4225-9eac-0903d4df186d" containerName="mariadb-database-create" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.719388 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.758957 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jz77g"] Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.827887 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qzh4\" (UniqueName: \"kubernetes.io/projected/973f958c-7114-4018-8021-19f6b15618b6-kube-api-access-2qzh4\") pod \"redhat-operators-jz77g\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.828136 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-catalog-content\") pod \"redhat-operators-jz77g\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.828305 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-utilities\") pod \"redhat-operators-jz77g\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.930759 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qzh4\" (UniqueName: \"kubernetes.io/projected/973f958c-7114-4018-8021-19f6b15618b6-kube-api-access-2qzh4\") pod \"redhat-operators-jz77g\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.930846 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-catalog-content\") pod \"redhat-operators-jz77g\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.930879 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-utilities\") pod \"redhat-operators-jz77g\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.931616 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-utilities\") pod \"redhat-operators-jz77g\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.931775 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-catalog-content\") pod \"redhat-operators-jz77g\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:33:59 crc kubenswrapper[4753]: I0129 15:33:59.948962 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2qzh4\" (UniqueName: \"kubernetes.io/projected/973f958c-7114-4018-8021-19f6b15618b6-kube-api-access-2qzh4\") pod \"redhat-operators-jz77g\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:34:00 crc kubenswrapper[4753]: I0129 15:34:00.057924 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:34:00 crc kubenswrapper[4753]: I0129 15:34:00.603130 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jz77g"] Jan 29 15:34:00 crc kubenswrapper[4753]: W0129 15:34:00.622141 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973f958c_7114_4018_8021_19f6b15618b6.slice/crio-23f35fa86c09be75a495afdf70da3b75d0f8f11ccf2ac5d9e6f5f5c59f27ee55 WatchSource:0}: Error finding container 23f35fa86c09be75a495afdf70da3b75d0f8f11ccf2ac5d9e6f5f5c59f27ee55: Status 404 returned error can't find the container with id 23f35fa86c09be75a495afdf70da3b75d0f8f11ccf2ac5d9e6f5f5c59f27ee55 Jan 29 15:34:00 crc kubenswrapper[4753]: I0129 15:34:00.720803 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz77g" event={"ID":"973f958c-7114-4018-8021-19f6b15618b6","Type":"ContainerStarted","Data":"23f35fa86c09be75a495afdf70da3b75d0f8f11ccf2ac5d9e6f5f5c59f27ee55"} Jan 29 15:34:00 crc kubenswrapper[4753]: I0129 15:34:00.724043 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf6s8" event={"ID":"1880c639-6cce-4b11-9806-9ec0ea6309b3","Type":"ContainerStarted","Data":"38e695fc9e9497227ba10d0f2948e74182ef75bb37dbac0b0e8eb47027652a96"} Jan 29 15:34:00 crc kubenswrapper[4753]: I0129 15:34:00.726518 4753 generic.go:334] "Generic (PLEG): container finished" podID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerID="f84cbbe2b470b6a6dcbaa54c1a78b8c87f584c2a5668456a9b0e138900e220ce" exitCode=0 Jan 29 15:34:00 crc kubenswrapper[4753]: I0129 15:34:00.726565 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfwqr" event={"ID":"86bfcbb9-d197-423a-b9b4-2ebd2aedac13","Type":"ContainerDied","Data":"f84cbbe2b470b6a6dcbaa54c1a78b8c87f584c2a5668456a9b0e138900e220ce"} Jan 29 15:34:01 crc kubenswrapper[4753]: I0129 15:34:01.739601 4753 generic.go:334] "Generic (PLEG): container finished" podID="973f958c-7114-4018-8021-19f6b15618b6" containerID="9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462" exitCode=0 Jan 29 15:34:01 crc kubenswrapper[4753]: I0129 15:34:01.739752 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz77g" event={"ID":"973f958c-7114-4018-8021-19f6b15618b6","Type":"ContainerDied","Data":"9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462"} Jan 29 15:34:01 crc kubenswrapper[4753]: I0129 15:34:01.744697 4753 generic.go:334] "Generic (PLEG): container finished" podID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerID="38e695fc9e9497227ba10d0f2948e74182ef75bb37dbac0b0e8eb47027652a96" exitCode=0 Jan 29 15:34:01 crc kubenswrapper[4753]: I0129 15:34:01.744796 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf6s8" event={"ID":"1880c639-6cce-4b11-9806-9ec0ea6309b3","Type":"ContainerDied","Data":"38e695fc9e9497227ba10d0f2948e74182ef75bb37dbac0b0e8eb47027652a96"} Jan 29 15:34:01 crc 
kubenswrapper[4753]: I0129 15:34:01.749442 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfwqr" event={"ID":"86bfcbb9-d197-423a-b9b4-2ebd2aedac13","Type":"ContainerStarted","Data":"75da81175aa647194232bb6d8948734a87508e0a5e0d94f58aaa3c778e79cda2"} Jan 29 15:34:01 crc kubenswrapper[4753]: I0129 15:34:01.793124 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zfwqr" podStartSLOduration=2.371028385 podStartE2EDuration="4.793100501s" podCreationTimestamp="2026-01-29 15:33:57 +0000 UTC" firstStartedPulling="2026-01-29 15:33:58.699966113 +0000 UTC m=+5473.394700495" lastFinishedPulling="2026-01-29 15:34:01.122038229 +0000 UTC m=+5475.816772611" observedRunningTime="2026-01-29 15:34:01.785979608 +0000 UTC m=+5476.480714020" watchObservedRunningTime="2026-01-29 15:34:01.793100501 +0000 UTC m=+5476.487834903" Jan 29 15:34:02 crc kubenswrapper[4753]: I0129 15:34:02.760258 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz77g" event={"ID":"973f958c-7114-4018-8021-19f6b15618b6","Type":"ContainerStarted","Data":"e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc"} Jan 29 15:34:02 crc kubenswrapper[4753]: I0129 15:34:02.763061 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf6s8" event={"ID":"1880c639-6cce-4b11-9806-9ec0ea6309b3","Type":"ContainerStarted","Data":"72d4c6de4aacbc90c53efa2c08e666e4e45a0ca12bb3952f5bc66ac530767e3e"} Jan 29 15:34:02 crc kubenswrapper[4753]: I0129 15:34:02.805654 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sf6s8" podStartSLOduration=4.216685923 podStartE2EDuration="6.805634968s" podCreationTimestamp="2026-01-29 15:33:56 +0000 UTC" firstStartedPulling="2026-01-29 15:33:59.707224268 +0000 UTC m=+5474.401958650" lastFinishedPulling="2026-01-29 15:34:02.296173313 +0000 UTC m=+5476.990907695" observedRunningTime="2026-01-29 15:34:02.803851539 +0000 UTC m=+5477.498585941" watchObservedRunningTime="2026-01-29 15:34:02.805634968 +0000 UTC m=+5477.500369370" Jan 29 15:34:03 crc kubenswrapper[4753]: I0129 15:34:03.776846 4753 generic.go:334] "Generic (PLEG): container finished" podID="973f958c-7114-4018-8021-19f6b15618b6" containerID="e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc" exitCode=0 Jan 29 15:34:03 crc kubenswrapper[4753]: I0129 15:34:03.776943 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz77g" event={"ID":"973f958c-7114-4018-8021-19f6b15618b6","Type":"ContainerDied","Data":"e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc"} Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.328504 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jljmv"] Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.329910 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.333026 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.333882 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.337789 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zjm4g" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.341125 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jljmv"] Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.416113 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-scripts\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.416222 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.416301 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntkh9\" (UniqueName: \"kubernetes.io/projected/0f925a54-0169-4eea-b309-8f7d168419a0-kube-api-access-ntkh9\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.416326 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-config-data\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.517664 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.519376 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntkh9\" (UniqueName: \"kubernetes.io/projected/0f925a54-0169-4eea-b309-8f7d168419a0-kube-api-access-ntkh9\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.519436 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-config-data\") pod \"nova-cell0-conductor-db-sync-jljmv\" 
(UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.519751 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-scripts\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.523770 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.523818 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-config-data\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.540399 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntkh9\" (UniqueName: \"kubernetes.io/projected/0f925a54-0169-4eea-b309-8f7d168419a0-kube-api-access-ntkh9\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.541833 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-scripts\") pod \"nova-cell0-conductor-db-sync-jljmv\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.648371 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.792469 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz77g" event={"ID":"973f958c-7114-4018-8021-19f6b15618b6","Type":"ContainerStarted","Data":"aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1"} Jan 29 15:34:04 crc kubenswrapper[4753]: I0129 15:34:04.814618 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jz77g" podStartSLOduration=3.365645436 podStartE2EDuration="5.814598177s" podCreationTimestamp="2026-01-29 15:33:59 +0000 UTC" firstStartedPulling="2026-01-29 15:34:01.742457755 +0000 UTC m=+5476.437192177" lastFinishedPulling="2026-01-29 15:34:04.191410536 +0000 UTC m=+5478.886144918" observedRunningTime="2026-01-29 15:34:04.812057659 +0000 UTC m=+5479.506792061" watchObservedRunningTime="2026-01-29 15:34:04.814598177 +0000 UTC m=+5479.509332559" Jan 29 15:34:05 crc kubenswrapper[4753]: W0129 15:34:05.164380 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f925a54_0169_4eea_b309_8f7d168419a0.slice/crio-832e0ff6ac81ba03e65369011e74328a6a522243415cbfafc2231dedda6d83ac WatchSource:0}: Error finding container 832e0ff6ac81ba03e65369011e74328a6a522243415cbfafc2231dedda6d83ac: Status 404 returned error can't find the container with id 832e0ff6ac81ba03e65369011e74328a6a522243415cbfafc2231dedda6d83ac Jan 29 15:34:05 crc kubenswrapper[4753]: I0129 15:34:05.165782 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jljmv"] Jan 29 15:34:05 crc kubenswrapper[4753]: I0129 15:34:05.803389 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jljmv" event={"ID":"0f925a54-0169-4eea-b309-8f7d168419a0","Type":"ContainerStarted","Data":"d846a7206d4ce7a763a19fcd1e81d0391536a4b1a2f7ff3f08eb9475b5e2ce5e"} Jan 29 15:34:05 crc kubenswrapper[4753]: I0129 15:34:05.803815 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jljmv" event={"ID":"0f925a54-0169-4eea-b309-8f7d168419a0","Type":"ContainerStarted","Data":"832e0ff6ac81ba03e65369011e74328a6a522243415cbfafc2231dedda6d83ac"} Jan 29 15:34:05 crc kubenswrapper[4753]: I0129 15:34:05.829728 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jljmv" podStartSLOduration=1.829710934 podStartE2EDuration="1.829710934s" podCreationTimestamp="2026-01-29 15:34:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:05.823700191 +0000 UTC m=+5480.518434593" watchObservedRunningTime="2026-01-29 15:34:05.829710934 +0000 UTC m=+5480.524445316" Jan 29 15:34:06 crc kubenswrapper[4753]: I0129 15:34:06.805686 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sf6s8" Jan 29 15:34:06 crc kubenswrapper[4753]: I0129 15:34:06.806091 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sf6s8" Jan 29 15:34:06 crc kubenswrapper[4753]: I0129 15:34:06.849487 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sf6s8" Jan 29 15:34:07 crc 
kubenswrapper[4753]: I0129 15:34:07.276979 4753 scope.go:117] "RemoveContainer" containerID="166553dd5a710aca9b0e0e02018aaf924bc1974ea43eb185e00a14ab9cf5ff9b" Jan 29 15:34:07 crc kubenswrapper[4753]: I0129 15:34:07.305764 4753 scope.go:117] "RemoveContainer" containerID="ff10efe4e2d91bbf36c5c780589d145331c9d9eeaa71ad4903c58cdd437c498b" Jan 29 15:34:07 crc kubenswrapper[4753]: I0129 15:34:07.339044 4753 scope.go:117] "RemoveContainer" containerID="35b5bb938da1b4b9e01854d47efc2d569a95a59a023d82c219b26874f9d17b63" Jan 29 15:34:07 crc kubenswrapper[4753]: I0129 15:34:07.527960 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zfwqr" Jan 29 15:34:07 crc kubenswrapper[4753]: I0129 15:34:07.528010 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zfwqr" Jan 29 15:34:07 crc kubenswrapper[4753]: I0129 15:34:07.568435 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zfwqr" Jan 29 15:34:07 crc kubenswrapper[4753]: I0129 15:34:07.968801 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zfwqr" Jan 29 15:34:07 crc kubenswrapper[4753]: I0129 15:34:07.998783 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sf6s8" Jan 29 15:34:09 crc kubenswrapper[4753]: I0129 15:34:09.896165 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sf6s8"] Jan 29 15:34:09 crc kubenswrapper[4753]: I0129 15:34:09.896439 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sf6s8" podUID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerName="registry-server" containerID="cri-o://72d4c6de4aacbc90c53efa2c08e666e4e45a0ca12bb3952f5bc66ac530767e3e" gracePeriod=2 Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.058014 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.058083 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.100693 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zfwqr"] Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.101059 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zfwqr" podUID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerName="registry-server" containerID="cri-o://75da81175aa647194232bb6d8948734a87508e0a5e0d94f58aaa3c778e79cda2" gracePeriod=2 Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.853196 4753 generic.go:334] "Generic (PLEG): container finished" podID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerID="75da81175aa647194232bb6d8948734a87508e0a5e0d94f58aaa3c778e79cda2" exitCode=0 Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.853731 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfwqr" event={"ID":"86bfcbb9-d197-423a-b9b4-2ebd2aedac13","Type":"ContainerDied","Data":"75da81175aa647194232bb6d8948734a87508e0a5e0d94f58aaa3c778e79cda2"} Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 
15:34:10.855879 4753 generic.go:334] "Generic (PLEG): container finished" podID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerID="72d4c6de4aacbc90c53efa2c08e666e4e45a0ca12bb3952f5bc66ac530767e3e" exitCode=0 Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.855902 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf6s8" event={"ID":"1880c639-6cce-4b11-9806-9ec0ea6309b3","Type":"ContainerDied","Data":"72d4c6de4aacbc90c53efa2c08e666e4e45a0ca12bb3952f5bc66ac530767e3e"} Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.855920 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf6s8" event={"ID":"1880c639-6cce-4b11-9806-9ec0ea6309b3","Type":"ContainerDied","Data":"f404f6da6e09fddab692342f9276b63e897d96295de855b903a32a890e965c7f"} Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.855934 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f404f6da6e09fddab692342f9276b63e897d96295de855b903a32a890e965c7f" Jan 29 15:34:10 crc kubenswrapper[4753]: I0129 15:34:10.886529 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sf6s8" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.047459 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdpts\" (UniqueName: \"kubernetes.io/projected/1880c639-6cce-4b11-9806-9ec0ea6309b3-kube-api-access-kdpts\") pod \"1880c639-6cce-4b11-9806-9ec0ea6309b3\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.047514 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-utilities\") pod \"1880c639-6cce-4b11-9806-9ec0ea6309b3\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.047649 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-catalog-content\") pod \"1880c639-6cce-4b11-9806-9ec0ea6309b3\" (UID: \"1880c639-6cce-4b11-9806-9ec0ea6309b3\") " Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.048894 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-utilities" (OuterVolumeSpecName: "utilities") pod "1880c639-6cce-4b11-9806-9ec0ea6309b3" (UID: "1880c639-6cce-4b11-9806-9ec0ea6309b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.053528 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1880c639-6cce-4b11-9806-9ec0ea6309b3-kube-api-access-kdpts" (OuterVolumeSpecName: "kube-api-access-kdpts") pod "1880c639-6cce-4b11-9806-9ec0ea6309b3" (UID: "1880c639-6cce-4b11-9806-9ec0ea6309b3"). InnerVolumeSpecName "kube-api-access-kdpts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.121484 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jz77g" podUID="973f958c-7114-4018-8021-19f6b15618b6" containerName="registry-server" probeResult="failure" output=< Jan 29 15:34:11 crc kubenswrapper[4753]: timeout: failed to connect service ":50051" within 1s Jan 29 15:34:11 crc kubenswrapper[4753]: > Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.151079 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdpts\" (UniqueName: \"kubernetes.io/projected/1880c639-6cce-4b11-9806-9ec0ea6309b3-kube-api-access-kdpts\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.151258 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.329407 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1880c639-6cce-4b11-9806-9ec0ea6309b3" (UID: "1880c639-6cce-4b11-9806-9ec0ea6309b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.354510 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1880c639-6cce-4b11-9806-9ec0ea6309b3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.462874 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zfwqr" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.556522 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb7m6\" (UniqueName: \"kubernetes.io/projected/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-kube-api-access-rb7m6\") pod \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.556570 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-catalog-content\") pod \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.556745 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-utilities\") pod \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\" (UID: \"86bfcbb9-d197-423a-b9b4-2ebd2aedac13\") " Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.557830 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-utilities" (OuterVolumeSpecName: "utilities") pod "86bfcbb9-d197-423a-b9b4-2ebd2aedac13" (UID: "86bfcbb9-d197-423a-b9b4-2ebd2aedac13"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.565922 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-kube-api-access-rb7m6" (OuterVolumeSpecName: "kube-api-access-rb7m6") pod "86bfcbb9-d197-423a-b9b4-2ebd2aedac13" (UID: "86bfcbb9-d197-423a-b9b4-2ebd2aedac13"). InnerVolumeSpecName "kube-api-access-rb7m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.610353 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86bfcbb9-d197-423a-b9b4-2ebd2aedac13" (UID: "86bfcbb9-d197-423a-b9b4-2ebd2aedac13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.659226 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb7m6\" (UniqueName: \"kubernetes.io/projected/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-kube-api-access-rb7m6\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.659610 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.659651 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86bfcbb9-d197-423a-b9b4-2ebd2aedac13-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.865693 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sf6s8" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.865699 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zfwqr" event={"ID":"86bfcbb9-d197-423a-b9b4-2ebd2aedac13","Type":"ContainerDied","Data":"1444304064e29bbcafc016eaf70ba18b7dbd6bedf3486ea9267010f594939b65"} Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.865777 4753 scope.go:117] "RemoveContainer" containerID="75da81175aa647194232bb6d8948734a87508e0a5e0d94f58aaa3c778e79cda2" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.865702 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zfwqr" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.886931 4753 scope.go:117] "RemoveContainer" containerID="f84cbbe2b470b6a6dcbaa54c1a78b8c87f584c2a5668456a9b0e138900e220ce" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.916967 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sf6s8"] Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.934596 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sf6s8"] Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.936028 4753 scope.go:117] "RemoveContainer" containerID="d23d1513a2c4a55507098aac45a6900ee3387ccb1ac652deb6adbe784a61b189" Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.943685 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zfwqr"] Jan 29 15:34:11 crc kubenswrapper[4753]: I0129 15:34:11.952459 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zfwqr"] Jan 29 15:34:12 crc kubenswrapper[4753]: I0129 15:34:12.159305 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1880c639-6cce-4b11-9806-9ec0ea6309b3" path="/var/lib/kubelet/pods/1880c639-6cce-4b11-9806-9ec0ea6309b3/volumes" Jan 29 15:34:12 crc kubenswrapper[4753]: I0129 15:34:12.160183 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" path="/var/lib/kubelet/pods/86bfcbb9-d197-423a-b9b4-2ebd2aedac13/volumes" Jan 29 15:34:13 crc kubenswrapper[4753]: I0129 15:34:13.888249 4753 generic.go:334] "Generic (PLEG): container finished" podID="0f925a54-0169-4eea-b309-8f7d168419a0" containerID="d846a7206d4ce7a763a19fcd1e81d0391536a4b1a2f7ff3f08eb9475b5e2ce5e" exitCode=0 Jan 29 15:34:13 crc kubenswrapper[4753]: I0129 15:34:13.888330 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jljmv" event={"ID":"0f925a54-0169-4eea-b309-8f7d168419a0","Type":"ContainerDied","Data":"d846a7206d4ce7a763a19fcd1e81d0391536a4b1a2f7ff3f08eb9475b5e2ce5e"} Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.224657 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.332436 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-combined-ca-bundle\") pod \"0f925a54-0169-4eea-b309-8f7d168419a0\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.332495 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntkh9\" (UniqueName: \"kubernetes.io/projected/0f925a54-0169-4eea-b309-8f7d168419a0-kube-api-access-ntkh9\") pod \"0f925a54-0169-4eea-b309-8f7d168419a0\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.332593 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-config-data\") pod \"0f925a54-0169-4eea-b309-8f7d168419a0\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.332645 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-scripts\") pod \"0f925a54-0169-4eea-b309-8f7d168419a0\" (UID: \"0f925a54-0169-4eea-b309-8f7d168419a0\") " Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.338828 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-scripts" (OuterVolumeSpecName: "scripts") pod "0f925a54-0169-4eea-b309-8f7d168419a0" (UID: "0f925a54-0169-4eea-b309-8f7d168419a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.341477 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f925a54-0169-4eea-b309-8f7d168419a0-kube-api-access-ntkh9" (OuterVolumeSpecName: "kube-api-access-ntkh9") pod "0f925a54-0169-4eea-b309-8f7d168419a0" (UID: "0f925a54-0169-4eea-b309-8f7d168419a0"). InnerVolumeSpecName "kube-api-access-ntkh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.360060 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f925a54-0169-4eea-b309-8f7d168419a0" (UID: "0f925a54-0169-4eea-b309-8f7d168419a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.364350 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-config-data" (OuterVolumeSpecName: "config-data") pod "0f925a54-0169-4eea-b309-8f7d168419a0" (UID: "0f925a54-0169-4eea-b309-8f7d168419a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.434630 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntkh9\" (UniqueName: \"kubernetes.io/projected/0f925a54-0169-4eea-b309-8f7d168419a0-kube-api-access-ntkh9\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.434663 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.434675 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.434683 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f925a54-0169-4eea-b309-8f7d168419a0-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.907370 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jljmv" event={"ID":"0f925a54-0169-4eea-b309-8f7d168419a0","Type":"ContainerDied","Data":"832e0ff6ac81ba03e65369011e74328a6a522243415cbfafc2231dedda6d83ac"} Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.908034 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832e0ff6ac81ba03e65369011e74328a6a522243415cbfafc2231dedda6d83ac" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.907443 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jljmv" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.990913 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 15:34:15 crc kubenswrapper[4753]: E0129 15:34:15.991307 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerName="registry-server" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991324 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerName="registry-server" Jan 29 15:34:15 crc kubenswrapper[4753]: E0129 15:34:15.991333 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f925a54-0169-4eea-b309-8f7d168419a0" containerName="nova-cell0-conductor-db-sync" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991339 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f925a54-0169-4eea-b309-8f7d168419a0" containerName="nova-cell0-conductor-db-sync" Jan 29 15:34:15 crc kubenswrapper[4753]: E0129 15:34:15.991356 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerName="extract-utilities" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991364 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerName="extract-utilities" Jan 29 15:34:15 crc kubenswrapper[4753]: E0129 15:34:15.991380 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerName="extract-content" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991386 4753 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerName="extract-content" Jan 29 15:34:15 crc kubenswrapper[4753]: E0129 15:34:15.991400 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerName="extract-utilities" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991406 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerName="extract-utilities" Jan 29 15:34:15 crc kubenswrapper[4753]: E0129 15:34:15.991415 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerName="registry-server" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991421 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerName="registry-server" Jan 29 15:34:15 crc kubenswrapper[4753]: E0129 15:34:15.991432 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerName="extract-content" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991439 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerName="extract-content" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991605 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="86bfcbb9-d197-423a-b9b4-2ebd2aedac13" containerName="registry-server" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991614 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="1880c639-6cce-4b11-9806-9ec0ea6309b3" containerName="registry-server" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.991622 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f925a54-0169-4eea-b309-8f7d168419a0" containerName="nova-cell0-conductor-db-sync" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.992201 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.994650 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 29 15:34:15 crc kubenswrapper[4753]: I0129 15:34:15.994782 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zjm4g" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.002761 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.147319 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.147683 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4gj8\" (UniqueName: \"kubernetes.io/projected/037a17d1-e25f-461b-be74-d4127a64ed11-kube-api-access-t4gj8\") pod \"nova-cell0-conductor-0\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.147721 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.249088 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.249205 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4gj8\" (UniqueName: \"kubernetes.io/projected/037a17d1-e25f-461b-be74-d4127a64ed11-kube-api-access-t4gj8\") pod \"nova-cell0-conductor-0\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.249272 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.255355 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.263497 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.268148 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4gj8\" (UniqueName: \"kubernetes.io/projected/037a17d1-e25f-461b-be74-d4127a64ed11-kube-api-access-t4gj8\") pod \"nova-cell0-conductor-0\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.307528 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.799448 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 15:34:16 crc kubenswrapper[4753]: W0129 15:34:16.802313 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod037a17d1_e25f_461b_be74_d4127a64ed11.slice/crio-4b93a42fd46e70cc7196f10a84459d5caf0a37f4c92a07788e98008014b23e62 WatchSource:0}: Error finding container 4b93a42fd46e70cc7196f10a84459d5caf0a37f4c92a07788e98008014b23e62: Status 404 returned error can't find the container with id 4b93a42fd46e70cc7196f10a84459d5caf0a37f4c92a07788e98008014b23e62 Jan 29 15:34:16 crc kubenswrapper[4753]: I0129 15:34:16.917091 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"037a17d1-e25f-461b-be74-d4127a64ed11","Type":"ContainerStarted","Data":"4b93a42fd46e70cc7196f10a84459d5caf0a37f4c92a07788e98008014b23e62"} Jan 29 15:34:17 crc kubenswrapper[4753]: I0129 15:34:17.931841 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"037a17d1-e25f-461b-be74-d4127a64ed11","Type":"ContainerStarted","Data":"56f98792b126883286141647cec180dbbb79a6818635795f8a311bcb1e51c95e"} Jan 29 15:34:17 crc kubenswrapper[4753]: I0129 15:34:17.932187 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:17 crc kubenswrapper[4753]: I0129 15:34:17.952709 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.952691806 podStartE2EDuration="2.952691806s" podCreationTimestamp="2026-01-29 15:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:17.951496103 +0000 UTC m=+5492.646230485" watchObservedRunningTime="2026-01-29 15:34:17.952691806 +0000 UTC m=+5492.647426188" Jan 29 15:34:20 crc kubenswrapper[4753]: I0129 15:34:20.143623 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:34:20 crc kubenswrapper[4753]: I0129 15:34:20.189759 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:34:20 crc kubenswrapper[4753]: I0129 15:34:20.378346 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jz77g"] Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.347768 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.776265 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-cell-mapping-n4smg"] Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.778029 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.779878 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.783181 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.803575 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-n4smg"] Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.847441 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-scripts\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.847515 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-config-data\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.847592 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.847685 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgm79\" (UniqueName: \"kubernetes.io/projected/8de6fe34-0d8e-494e-8cea-b91c8e437b88-kube-api-access-bgm79\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.906273 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.908419 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.911114 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.925645 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.953547 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-scripts\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.953633 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-config-data\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.953680 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.953753 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgm79\" (UniqueName: \"kubernetes.io/projected/8de6fe34-0d8e-494e-8cea-b91c8e437b88-kube-api-access-bgm79\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:21 crc kubenswrapper[4753]: I0129 15:34:21.986706 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-scripts\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.021844 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jz77g" podUID="973f958c-7114-4018-8021-19f6b15618b6" containerName="registry-server" containerID="cri-o://aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1" gracePeriod=2 Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.036551 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.039974 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-config-data\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.040734 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bgm79\" (UniqueName: \"kubernetes.io/projected/8de6fe34-0d8e-494e-8cea-b91c8e437b88-kube-api-access-bgm79\") pod \"nova-cell0-cell-mapping-n4smg\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.060208 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03b8773f-8f94-456c-9075-fd1c69ad7e20-logs\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.060313 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n57nl\" (UniqueName: \"kubernetes.io/projected/03b8773f-8f94-456c-9075-fd1c69ad7e20-kube-api-access-n57nl\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.060434 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.060549 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-config-data\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.085295 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.086963 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.090382 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.094296 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.100562 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.161826 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-config-data\") pod \"nova-scheduler-0\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.161914 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03b8773f-8f94-456c-9075-fd1c69ad7e20-logs\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.161978 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n57nl\" (UniqueName: \"kubernetes.io/projected/03b8773f-8f94-456c-9075-fd1c69ad7e20-kube-api-access-n57nl\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.162011 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25gs2\" (UniqueName: \"kubernetes.io/projected/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-kube-api-access-25gs2\") pod \"nova-scheduler-0\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.162047 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.162172 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-config-data\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.162239 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.163108 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03b8773f-8f94-456c-9075-fd1c69ad7e20-logs\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.167961 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-config-data\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.196877 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n57nl\" (UniqueName: 
\"kubernetes.io/projected/03b8773f-8f94-456c-9075-fd1c69ad7e20-kube-api-access-n57nl\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.197531 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.220002 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.221779 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.234169 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.234548 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.262253 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.263605 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.263799 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.263889 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-config-data\") pod \"nova-scheduler-0\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.263985 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25gs2\" (UniqueName: \"kubernetes.io/projected/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-kube-api-access-25gs2\") pod \"nova-scheduler-0\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.265439 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.278000 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.281689 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.283477 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-config-data\") pod \"nova-scheduler-0\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.283535 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.305427 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77779959c9-5sw5r"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.307251 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.313809 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25gs2\" (UniqueName: \"kubernetes.io/projected/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-kube-api-access-25gs2\") pod \"nova-scheduler-0\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.349207 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77779959c9-5sw5r"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.366065 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-sb\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.366200 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.366246 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392865b5-bb31-439f-b001-b6ef6b7a1a4c-logs\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.366285 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-dns-svc\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc 
kubenswrapper[4753]: I0129 15:34:22.366353 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8zl5\" (UniqueName: \"kubernetes.io/projected/392865b5-bb31-439f-b001-b6ef6b7a1a4c-kube-api-access-z8zl5\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.366415 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-config\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.366464 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ztqn\" (UniqueName: \"kubernetes.io/projected/8fc5f142-1f9b-45b9-a0df-0abe9530d688-kube-api-access-4ztqn\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.366554 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-nb\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.366590 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4lhw\" (UniqueName: \"kubernetes.io/projected/6fe820fd-d9f1-409c-a860-f89f10c60c12-kube-api-access-q4lhw\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.366617 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.367001 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.367062 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-config-data\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469388 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469474 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-config-data\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469548 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-sb\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469616 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469644 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392865b5-bb31-439f-b001-b6ef6b7a1a4c-logs\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469674 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-dns-svc\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469752 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8zl5\" (UniqueName: \"kubernetes.io/projected/392865b5-bb31-439f-b001-b6ef6b7a1a4c-kube-api-access-z8zl5\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469817 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-config\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469868 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ztqn\" (UniqueName: \"kubernetes.io/projected/8fc5f142-1f9b-45b9-a0df-0abe9530d688-kube-api-access-4ztqn\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.469975 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-nb\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.470036 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-q4lhw\" (UniqueName: \"kubernetes.io/projected/6fe820fd-d9f1-409c-a860-f89f10c60c12-kube-api-access-q4lhw\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.470062 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.471318 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-dns-svc\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.471965 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-sb\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.473775 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.474366 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.474775 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392865b5-bb31-439f-b001-b6ef6b7a1a4c-logs\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.474848 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.475374 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-nb\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.475414 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-config\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 
29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.480845 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-config-data\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.490040 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4lhw\" (UniqueName: \"kubernetes.io/projected/6fe820fd-d9f1-409c-a860-f89f10c60c12-kube-api-access-q4lhw\") pod \"nova-cell1-novncproxy-0\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.496413 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ztqn\" (UniqueName: \"kubernetes.io/projected/8fc5f142-1f9b-45b9-a0df-0abe9530d688-kube-api-access-4ztqn\") pod \"dnsmasq-dns-77779959c9-5sw5r\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.497599 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8zl5\" (UniqueName: \"kubernetes.io/projected/392865b5-bb31-439f-b001-b6ef6b7a1a4c-kube-api-access-z8zl5\") pod \"nova-metadata-0\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.613855 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.635176 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.655493 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.683740 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.733053 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-n4smg"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.769774 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:34:22 crc kubenswrapper[4753]: W0129 15:34:22.772467 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8de6fe34_0d8e_494e_8cea_b91c8e437b88.slice/crio-747bf346e2ad96df2ddb9af255765c774ff41072363ce4e0c165ecd95eb5d7e6 WatchSource:0}: Error finding container 747bf346e2ad96df2ddb9af255765c774ff41072363ce4e0c165ecd95eb5d7e6: Status 404 returned error can't find the container with id 747bf346e2ad96df2ddb9af255765c774ff41072363ce4e0c165ecd95eb5d7e6 Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.875484 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-utilities\") pod \"973f958c-7114-4018-8021-19f6b15618b6\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.875599 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-catalog-content\") pod \"973f958c-7114-4018-8021-19f6b15618b6\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.875676 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qzh4\" (UniqueName: \"kubernetes.io/projected/973f958c-7114-4018-8021-19f6b15618b6-kube-api-access-2qzh4\") pod \"973f958c-7114-4018-8021-19f6b15618b6\" (UID: \"973f958c-7114-4018-8021-19f6b15618b6\") " Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.877519 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-utilities" (OuterVolumeSpecName: "utilities") pod "973f958c-7114-4018-8021-19f6b15618b6" (UID: "973f958c-7114-4018-8021-19f6b15618b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.882480 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973f958c-7114-4018-8021-19f6b15618b6-kube-api-access-2qzh4" (OuterVolumeSpecName: "kube-api-access-2qzh4") pod "973f958c-7114-4018-8021-19f6b15618b6" (UID: "973f958c-7114-4018-8021-19f6b15618b6"). InnerVolumeSpecName "kube-api-access-2qzh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.932200 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.988240 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.988283 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qzh4\" (UniqueName: \"kubernetes.io/projected/973f958c-7114-4018-8021-19f6b15618b6-kube-api-access-2qzh4\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.990094 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4zbrh"] Jan 29 15:34:22 crc kubenswrapper[4753]: E0129 15:34:22.990630 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973f958c-7114-4018-8021-19f6b15618b6" containerName="registry-server" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.990710 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="973f958c-7114-4018-8021-19f6b15618b6" containerName="registry-server" Jan 29 15:34:22 crc kubenswrapper[4753]: E0129 15:34:22.990796 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973f958c-7114-4018-8021-19f6b15618b6" containerName="extract-utilities" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.990854 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="973f958c-7114-4018-8021-19f6b15618b6" containerName="extract-utilities" Jan 29 15:34:22 crc kubenswrapper[4753]: E0129 15:34:22.990922 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973f958c-7114-4018-8021-19f6b15618b6" containerName="extract-content" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.991013 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="973f958c-7114-4018-8021-19f6b15618b6" containerName="extract-content" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.991285 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="973f958c-7114-4018-8021-19f6b15618b6" containerName="registry-server" Jan 29 15:34:22 crc kubenswrapper[4753]: I0129 15:34:22.992140 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.001917 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.002180 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.019336 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.030520 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4zbrh"] Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.042243 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5","Type":"ContainerStarted","Data":"b671f2f92cead6de96ac49d6678c265f605ddb4cdb3d256049c509bf8e42a3d9"} Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.052288 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03b8773f-8f94-456c-9075-fd1c69ad7e20","Type":"ContainerStarted","Data":"01ffad8a6de73b68d55368784dd03c0e7e187b8a1c7cbfd64e07cb082d9449ca"} Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.057447 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-n4smg" event={"ID":"8de6fe34-0d8e-494e-8cea-b91c8e437b88","Type":"ContainerStarted","Data":"747bf346e2ad96df2ddb9af255765c774ff41072363ce4e0c165ecd95eb5d7e6"} Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.068512 4753 generic.go:334] "Generic (PLEG): container finished" podID="973f958c-7114-4018-8021-19f6b15618b6" containerID="aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1" exitCode=0 Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.068627 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz77g" event={"ID":"973f958c-7114-4018-8021-19f6b15618b6","Type":"ContainerDied","Data":"aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1"} Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.068697 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jz77g" event={"ID":"973f958c-7114-4018-8021-19f6b15618b6","Type":"ContainerDied","Data":"23f35fa86c09be75a495afdf70da3b75d0f8f11ccf2ac5d9e6f5f5c59f27ee55"} Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.068783 4753 scope.go:117] "RemoveContainer" containerID="aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.068952 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jz77g" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.089889 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwsbn\" (UniqueName: \"kubernetes.io/projected/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-kube-api-access-fwsbn\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.090019 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.090475 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-config-data\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.090558 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-scripts\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.120909 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "973f958c-7114-4018-8021-19f6b15618b6" (UID: "973f958c-7114-4018-8021-19f6b15618b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.169945 4753 scope.go:117] "RemoveContainer" containerID="e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.196688 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-config-data\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.196940 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-scripts\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.197003 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwsbn\" (UniqueName: \"kubernetes.io/projected/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-kube-api-access-fwsbn\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.201380 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-config-data\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.204394 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.204568 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/973f958c-7114-4018-8021-19f6b15618b6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.204613 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-scripts\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.209413 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.232782 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwsbn\" (UniqueName: \"kubernetes.io/projected/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-kube-api-access-fwsbn\") pod 
\"nova-cell1-conductor-db-sync-4zbrh\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.303713 4753 scope.go:117] "RemoveContainer" containerID="9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.307028 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.330288 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.352451 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.361469 4753 scope.go:117] "RemoveContainer" containerID="aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1" Jan 29 15:34:23 crc kubenswrapper[4753]: E0129 15:34:23.361920 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1\": container with ID starting with aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1 not found: ID does not exist" containerID="aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.361956 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1"} err="failed to get container status \"aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1\": rpc error: code = NotFound desc = could not find container \"aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1\": container with ID starting with aa84ee4698b13c53545b211cfc2921f216e98e6719fd3ce568566aef595edba1 not found: ID does not exist" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.361984 4753 scope.go:117] "RemoveContainer" containerID="e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc" Jan 29 15:34:23 crc kubenswrapper[4753]: E0129 15:34:23.362441 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc\": container with ID starting with e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc not found: ID does not exist" containerID="e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.362470 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc"} err="failed to get container status \"e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc\": rpc error: code = NotFound desc = could not find container \"e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc\": container with ID starting with e5b38fe75cae438170a35944bc34c86ddf35c8cbcee5ca5a4c3563911fcf79dc not found: ID does not exist" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.362489 4753 scope.go:117] "RemoveContainer" containerID="9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462" Jan 29 15:34:23 crc kubenswrapper[4753]: E0129 15:34:23.363055 4753 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462\": container with ID starting with 9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462 not found: ID does not exist" containerID="9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.363124 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462"} err="failed to get container status \"9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462\": rpc error: code = NotFound desc = could not find container \"9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462\": container with ID starting with 9df34938a2fe90bc07ad55a52fb05d6c274b64a0333b528f9ea0deba1af4b462 not found: ID does not exist" Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.422898 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jz77g"] Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.443565 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jz77g"] Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.459206 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77779959c9-5sw5r"] Jan 29 15:34:23 crc kubenswrapper[4753]: W0129 15:34:23.459823 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fc5f142_1f9b_45b9_a0df_0abe9530d688.slice/crio-72fd5c8b652eb05a43c54852288c24cdde8875a198545469926f968be8c1b0a4 WatchSource:0}: Error finding container 72fd5c8b652eb05a43c54852288c24cdde8875a198545469926f968be8c1b0a4: Status 404 returned error can't find the container with id 72fd5c8b652eb05a43c54852288c24cdde8875a198545469926f968be8c1b0a4 Jan 29 15:34:23 crc kubenswrapper[4753]: I0129 15:34:23.849776 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4zbrh"] Jan 29 15:34:23 crc kubenswrapper[4753]: W0129 15:34:23.891403 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d7cd059_b4b2_4516_b9d4_2e2e13b7b38b.slice/crio-e07cff32c133de717718876aaedb18ed05fff821bd144bb2a0c2431fd931c131 WatchSource:0}: Error finding container e07cff32c133de717718876aaedb18ed05fff821bd144bb2a0c2431fd931c131: Status 404 returned error can't find the container with id e07cff32c133de717718876aaedb18ed05fff821bd144bb2a0c2431fd931c131 Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.077859 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"392865b5-bb31-439f-b001-b6ef6b7a1a4c","Type":"ContainerStarted","Data":"0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.077916 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"392865b5-bb31-439f-b001-b6ef6b7a1a4c","Type":"ContainerStarted","Data":"dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.077930 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"392865b5-bb31-439f-b001-b6ef6b7a1a4c","Type":"ContainerStarted","Data":"e92962e503ee400bc32c1bff4b5557304c5f4aac07eaf1f072875c082d9dfd22"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.079757 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4zbrh" event={"ID":"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b","Type":"ContainerStarted","Data":"e07cff32c133de717718876aaedb18ed05fff821bd144bb2a0c2431fd931c131"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.085004 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-n4smg" event={"ID":"8de6fe34-0d8e-494e-8cea-b91c8e437b88","Type":"ContainerStarted","Data":"3452c8832f0b8b0f8cf58d57bac5c5cf11c9bf27006d1a303881002226f68c73"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.088517 4753 generic.go:334] "Generic (PLEG): container finished" podID="8fc5f142-1f9b-45b9-a0df-0abe9530d688" containerID="89fb396c49784ee588ac62b06ace3b0e44cc17ff37a9366ec923a48df5fcd70f" exitCode=0 Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.088621 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" event={"ID":"8fc5f142-1f9b-45b9-a0df-0abe9530d688","Type":"ContainerDied","Data":"89fb396c49784ee588ac62b06ace3b0e44cc17ff37a9366ec923a48df5fcd70f"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.088671 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" event={"ID":"8fc5f142-1f9b-45b9-a0df-0abe9530d688","Type":"ContainerStarted","Data":"72fd5c8b652eb05a43c54852288c24cdde8875a198545469926f968be8c1b0a4"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.090208 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6fe820fd-d9f1-409c-a860-f89f10c60c12","Type":"ContainerStarted","Data":"bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.090535 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6fe820fd-d9f1-409c-a860-f89f10c60c12","Type":"ContainerStarted","Data":"873d143c83f8f531a6b0d96ef80668fa2bd5253056c3200f12ded05cb147f89e"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.093784 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5","Type":"ContainerStarted","Data":"9a75ac00c7daba2c415d2e3c7b0591c8b551d4903fd8edd980d0e552aba51abf"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.102499 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.102467 podStartE2EDuration="2.102467s" podCreationTimestamp="2026-01-29 15:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:24.099368447 +0000 UTC m=+5498.794102829" watchObservedRunningTime="2026-01-29 15:34:24.102467 +0000 UTC m=+5498.797201382" Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.109205 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03b8773f-8f94-456c-9075-fd1c69ad7e20","Type":"ContainerStarted","Data":"5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.109266 4753 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-0" event={"ID":"03b8773f-8f94-456c-9075-fd1c69ad7e20","Type":"ContainerStarted","Data":"852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51"} Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.144259 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.144237709 podStartE2EDuration="2.144237709s" podCreationTimestamp="2026-01-29 15:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:24.143690804 +0000 UTC m=+5498.838425186" watchObservedRunningTime="2026-01-29 15:34:24.144237709 +0000 UTC m=+5498.838972091" Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.178159 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.178119376 podStartE2EDuration="2.178119376s" podCreationTimestamp="2026-01-29 15:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:24.165897638 +0000 UTC m=+5498.860632030" watchObservedRunningTime="2026-01-29 15:34:24.178119376 +0000 UTC m=+5498.872853768" Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.189965 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973f958c-7114-4018-8021-19f6b15618b6" path="/var/lib/kubelet/pods/973f958c-7114-4018-8021-19f6b15618b6/volumes" Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.221893 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-n4smg" podStartSLOduration=3.221875598 podStartE2EDuration="3.221875598s" podCreationTimestamp="2026-01-29 15:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:24.215307962 +0000 UTC m=+5498.910042364" watchObservedRunningTime="2026-01-29 15:34:24.221875598 +0000 UTC m=+5498.916609980" Jan 29 15:34:24 crc kubenswrapper[4753]: I0129 15:34:24.256823 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.256798503 podStartE2EDuration="3.256798503s" podCreationTimestamp="2026-01-29 15:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:24.231457464 +0000 UTC m=+5498.926191866" watchObservedRunningTime="2026-01-29 15:34:24.256798503 +0000 UTC m=+5498.951532885" Jan 29 15:34:25 crc kubenswrapper[4753]: I0129 15:34:25.120240 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4zbrh" event={"ID":"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b","Type":"ContainerStarted","Data":"516b3d758bf8f47b9acb854501c20565ce10a1920afc00e60bb79fe2e2de0fe1"} Jan 29 15:34:25 crc kubenswrapper[4753]: I0129 15:34:25.122234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" event={"ID":"8fc5f142-1f9b-45b9-a0df-0abe9530d688","Type":"ContainerStarted","Data":"950fdfde8da868542ae3476859a5f714f1afd03535c8e5b01a25b64b5f6fba9b"} Jan 29 15:34:25 crc kubenswrapper[4753]: I0129 15:34:25.140643 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4zbrh" podStartSLOduration=3.140621454 
podStartE2EDuration="3.140621454s" podCreationTimestamp="2026-01-29 15:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:25.134735256 +0000 UTC m=+5499.829469638" watchObservedRunningTime="2026-01-29 15:34:25.140621454 +0000 UTC m=+5499.835355836" Jan 29 15:34:26 crc kubenswrapper[4753]: I0129 15:34:26.131325 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:27 crc kubenswrapper[4753]: I0129 15:34:27.054789 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:34:27 crc kubenswrapper[4753]: I0129 15:34:27.055177 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:34:27 crc kubenswrapper[4753]: I0129 15:34:27.614851 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 15:34:27 crc kubenswrapper[4753]: I0129 15:34:27.636069 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 15:34:27 crc kubenswrapper[4753]: I0129 15:34:27.637319 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 15:34:27 crc kubenswrapper[4753]: I0129 15:34:27.656290 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:28 crc kubenswrapper[4753]: I0129 15:34:28.160796 4753 generic.go:334] "Generic (PLEG): container finished" podID="0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b" containerID="516b3d758bf8f47b9acb854501c20565ce10a1920afc00e60bb79fe2e2de0fe1" exitCode=0 Jan 29 15:34:28 crc kubenswrapper[4753]: I0129 15:34:28.161604 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4zbrh" event={"ID":"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b","Type":"ContainerDied","Data":"516b3d758bf8f47b9acb854501c20565ce10a1920afc00e60bb79fe2e2de0fe1"} Jan 29 15:34:28 crc kubenswrapper[4753]: I0129 15:34:28.181656 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" podStartSLOduration=6.18046903 podStartE2EDuration="6.18046903s" podCreationTimestamp="2026-01-29 15:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:25.161768801 +0000 UTC m=+5499.856503193" watchObservedRunningTime="2026-01-29 15:34:28.18046903 +0000 UTC m=+5502.875203412" Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.170546 4753 generic.go:334] "Generic (PLEG): container finished" podID="8de6fe34-0d8e-494e-8cea-b91c8e437b88" containerID="3452c8832f0b8b0f8cf58d57bac5c5cf11c9bf27006d1a303881002226f68c73" exitCode=0 Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.170666 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-n4smg" 
event={"ID":"8de6fe34-0d8e-494e-8cea-b91c8e437b88","Type":"ContainerDied","Data":"3452c8832f0b8b0f8cf58d57bac5c5cf11c9bf27006d1a303881002226f68c73"} Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.501983 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.541778 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-combined-ca-bundle\") pod \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.541829 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-config-data\") pod \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.541867 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwsbn\" (UniqueName: \"kubernetes.io/projected/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-kube-api-access-fwsbn\") pod \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.541908 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-scripts\") pod \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\" (UID: \"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b\") " Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.547245 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-scripts" (OuterVolumeSpecName: "scripts") pod "0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b" (UID: "0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.547739 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-kube-api-access-fwsbn" (OuterVolumeSpecName: "kube-api-access-fwsbn") pod "0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b" (UID: "0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b"). InnerVolumeSpecName "kube-api-access-fwsbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.568384 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b" (UID: "0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.572046 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-config-data" (OuterVolumeSpecName: "config-data") pod "0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b" (UID: "0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.643820 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.643854 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.643864 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwsbn\" (UniqueName: \"kubernetes.io/projected/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-kube-api-access-fwsbn\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:29 crc kubenswrapper[4753]: I0129 15:34:29.643873 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.182479 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4zbrh" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.182481 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4zbrh" event={"ID":"0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b","Type":"ContainerDied","Data":"e07cff32c133de717718876aaedb18ed05fff821bd144bb2a0c2431fd931c131"} Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.186133 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e07cff32c133de717718876aaedb18ed05fff821bd144bb2a0c2431fd931c131" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.263257 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 15:34:30 crc kubenswrapper[4753]: E0129 15:34:30.263811 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b" containerName="nova-cell1-conductor-db-sync" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.263836 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b" containerName="nova-cell1-conductor-db-sync" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.264072 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b" containerName="nova-cell1-conductor-db-sync" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.264958 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.268392 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.284385 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.368608 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.368828 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpw5j\" (UniqueName: \"kubernetes.io/projected/c40be523-36a9-4bd7-8568-94e2921da7bd-kube-api-access-dpw5j\") pod \"nova-cell1-conductor-0\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.369003 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.471560 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.472316 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.473028 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpw5j\" (UniqueName: \"kubernetes.io/projected/c40be523-36a9-4bd7-8568-94e2921da7bd-kube-api-access-dpw5j\") pod \"nova-cell1-conductor-0\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.476418 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.481641 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.492610 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpw5j\" (UniqueName: \"kubernetes.io/projected/c40be523-36a9-4bd7-8568-94e2921da7bd-kube-api-access-dpw5j\") pod \"nova-cell1-conductor-0\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.562099 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.576872 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-combined-ca-bundle\") pod \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.577053 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-config-data\") pod \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.577249 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgm79\" (UniqueName: \"kubernetes.io/projected/8de6fe34-0d8e-494e-8cea-b91c8e437b88-kube-api-access-bgm79\") pod \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.577375 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-scripts\") pod \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\" (UID: \"8de6fe34-0d8e-494e-8cea-b91c8e437b88\") " Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.582282 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.583356 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de6fe34-0d8e-494e-8cea-b91c8e437b88-kube-api-access-bgm79" (OuterVolumeSpecName: "kube-api-access-bgm79") pod "8de6fe34-0d8e-494e-8cea-b91c8e437b88" (UID: "8de6fe34-0d8e-494e-8cea-b91c8e437b88"). InnerVolumeSpecName "kube-api-access-bgm79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.584076 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-scripts" (OuterVolumeSpecName: "scripts") pod "8de6fe34-0d8e-494e-8cea-b91c8e437b88" (UID: "8de6fe34-0d8e-494e-8cea-b91c8e437b88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.609849 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-config-data" (OuterVolumeSpecName: "config-data") pod "8de6fe34-0d8e-494e-8cea-b91c8e437b88" (UID: "8de6fe34-0d8e-494e-8cea-b91c8e437b88"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.625603 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8de6fe34-0d8e-494e-8cea-b91c8e437b88" (UID: "8de6fe34-0d8e-494e-8cea-b91c8e437b88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.682201 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.682273 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.682289 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgm79\" (UniqueName: \"kubernetes.io/projected/8de6fe34-0d8e-494e-8cea-b91c8e437b88-kube-api-access-bgm79\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:30 crc kubenswrapper[4753]: I0129 15:34:30.682309 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8de6fe34-0d8e-494e-8cea-b91c8e437b88-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.019850 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.191520 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-n4smg" event={"ID":"8de6fe34-0d8e-494e-8cea-b91c8e437b88","Type":"ContainerDied","Data":"747bf346e2ad96df2ddb9af255765c774ff41072363ce4e0c165ecd95eb5d7e6"} Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.191878 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="747bf346e2ad96df2ddb9af255765c774ff41072363ce4e0c165ecd95eb5d7e6" Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.191538 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-n4smg" Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.192618 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c40be523-36a9-4bd7-8568-94e2921da7bd","Type":"ContainerStarted","Data":"7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae"} Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.192641 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c40be523-36a9-4bd7-8568-94e2921da7bd","Type":"ContainerStarted","Data":"2d6167c935c3ed3859388a9123b851d778486d3268117fd4f964957967df613c"} Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.192738 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.219614 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.219592966 podStartE2EDuration="1.219592966s" podCreationTimestamp="2026-01-29 15:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:31.213688248 +0000 UTC m=+5505.908422630" watchObservedRunningTime="2026-01-29 15:34:31.219592966 +0000 UTC m=+5505.914327348" Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.381268 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.381530 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerName="nova-api-log" containerID="cri-o://852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51" gracePeriod=30 Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.381694 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerName="nova-api-api" containerID="cri-o://5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4" gracePeriod=30 Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.403221 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.403500 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f8bfa9d3-3e74-41d4-a434-873bbc09dfd5" containerName="nova-scheduler-scheduler" containerID="cri-o://9a75ac00c7daba2c415d2e3c7b0591c8b551d4903fd8edd980d0e552aba51abf" gracePeriod=30 Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.418392 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.418687 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerName="nova-metadata-log" containerID="cri-o://dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6" gracePeriod=30 Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.418720 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerName="nova-metadata-metadata" 
containerID="cri-o://0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4" gracePeriod=30 Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.972953 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:31 crc kubenswrapper[4753]: I0129 15:34:31.991177 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.017874 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8zl5\" (UniqueName: \"kubernetes.io/projected/392865b5-bb31-439f-b001-b6ef6b7a1a4c-kube-api-access-z8zl5\") pod \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.017965 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392865b5-bb31-439f-b001-b6ef6b7a1a4c-logs\") pod \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.017999 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-config-data\") pod \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.018044 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03b8773f-8f94-456c-9075-fd1c69ad7e20-logs\") pod \"03b8773f-8f94-456c-9075-fd1c69ad7e20\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.018090 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-combined-ca-bundle\") pod \"03b8773f-8f94-456c-9075-fd1c69ad7e20\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.018182 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-combined-ca-bundle\") pod \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\" (UID: \"392865b5-bb31-439f-b001-b6ef6b7a1a4c\") " Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.018212 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-config-data\") pod \"03b8773f-8f94-456c-9075-fd1c69ad7e20\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.018249 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n57nl\" (UniqueName: \"kubernetes.io/projected/03b8773f-8f94-456c-9075-fd1c69ad7e20-kube-api-access-n57nl\") pod \"03b8773f-8f94-456c-9075-fd1c69ad7e20\" (UID: \"03b8773f-8f94-456c-9075-fd1c69ad7e20\") " Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.020091 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03b8773f-8f94-456c-9075-fd1c69ad7e20-logs" (OuterVolumeSpecName: "logs") pod "03b8773f-8f94-456c-9075-fd1c69ad7e20" (UID: 
"03b8773f-8f94-456c-9075-fd1c69ad7e20"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.020955 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392865b5-bb31-439f-b001-b6ef6b7a1a4c-logs" (OuterVolumeSpecName: "logs") pod "392865b5-bb31-439f-b001-b6ef6b7a1a4c" (UID: "392865b5-bb31-439f-b001-b6ef6b7a1a4c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.028132 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b8773f-8f94-456c-9075-fd1c69ad7e20-kube-api-access-n57nl" (OuterVolumeSpecName: "kube-api-access-n57nl") pod "03b8773f-8f94-456c-9075-fd1c69ad7e20" (UID: "03b8773f-8f94-456c-9075-fd1c69ad7e20"). InnerVolumeSpecName "kube-api-access-n57nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.032214 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392865b5-bb31-439f-b001-b6ef6b7a1a4c-kube-api-access-z8zl5" (OuterVolumeSpecName: "kube-api-access-z8zl5") pod "392865b5-bb31-439f-b001-b6ef6b7a1a4c" (UID: "392865b5-bb31-439f-b001-b6ef6b7a1a4c"). InnerVolumeSpecName "kube-api-access-z8zl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.045494 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-config-data" (OuterVolumeSpecName: "config-data") pod "03b8773f-8f94-456c-9075-fd1c69ad7e20" (UID: "03b8773f-8f94-456c-9075-fd1c69ad7e20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.045697 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-config-data" (OuterVolumeSpecName: "config-data") pod "392865b5-bb31-439f-b001-b6ef6b7a1a4c" (UID: "392865b5-bb31-439f-b001-b6ef6b7a1a4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.053414 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "392865b5-bb31-439f-b001-b6ef6b7a1a4c" (UID: "392865b5-bb31-439f-b001-b6ef6b7a1a4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.057385 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03b8773f-8f94-456c-9075-fd1c69ad7e20" (UID: "03b8773f-8f94-456c-9075-fd1c69ad7e20"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.119491 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.119524 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.119537 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b8773f-8f94-456c-9075-fd1c69ad7e20-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.119549 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n57nl\" (UniqueName: \"kubernetes.io/projected/03b8773f-8f94-456c-9075-fd1c69ad7e20-kube-api-access-n57nl\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.119562 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8zl5\" (UniqueName: \"kubernetes.io/projected/392865b5-bb31-439f-b001-b6ef6b7a1a4c-kube-api-access-z8zl5\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.119571 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392865b5-bb31-439f-b001-b6ef6b7a1a4c-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.119580 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392865b5-bb31-439f-b001-b6ef6b7a1a4c-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.119589 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03b8773f-8f94-456c-9075-fd1c69ad7e20-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.203067 4753 generic.go:334] "Generic (PLEG): container finished" podID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerID="0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4" exitCode=0 Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.203101 4753 generic.go:334] "Generic (PLEG): container finished" podID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerID="dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6" exitCode=143 Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.203143 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"392865b5-bb31-439f-b001-b6ef6b7a1a4c","Type":"ContainerDied","Data":"0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4"} Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.203188 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"392865b5-bb31-439f-b001-b6ef6b7a1a4c","Type":"ContainerDied","Data":"dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6"} Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.203200 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"392865b5-bb31-439f-b001-b6ef6b7a1a4c","Type":"ContainerDied","Data":"e92962e503ee400bc32c1bff4b5557304c5f4aac07eaf1f072875c082d9dfd22"} Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.203215 4753 scope.go:117] "RemoveContainer" containerID="0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.203330 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.207248 4753 generic.go:334] "Generic (PLEG): container finished" podID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerID="5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4" exitCode=0 Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.207269 4753 generic.go:334] "Generic (PLEG): container finished" podID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerID="852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51" exitCode=143 Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.207292 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.207295 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03b8773f-8f94-456c-9075-fd1c69ad7e20","Type":"ContainerDied","Data":"5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4"} Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.207343 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03b8773f-8f94-456c-9075-fd1c69ad7e20","Type":"ContainerDied","Data":"852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51"} Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.207361 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"03b8773f-8f94-456c-9075-fd1c69ad7e20","Type":"ContainerDied","Data":"01ffad8a6de73b68d55368784dd03c0e7e187b8a1c7cbfd64e07cb082d9449ca"} Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.240297 4753 scope.go:117] "RemoveContainer" containerID="dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.243415 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.262361 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.269587 4753 scope.go:117] "RemoveContainer" containerID="0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4" Jan 29 15:34:32 crc kubenswrapper[4753]: E0129 15:34:32.274366 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4\": container with ID starting with 0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4 not found: ID does not exist" containerID="0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.274455 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4"} err="failed to get container status \"0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4\": rpc error: code = 
NotFound desc = could not find container \"0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4\": container with ID starting with 0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4 not found: ID does not exist" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.274491 4753 scope.go:117] "RemoveContainer" containerID="dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6" Jan 29 15:34:32 crc kubenswrapper[4753]: E0129 15:34:32.275013 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6\": container with ID starting with dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6 not found: ID does not exist" containerID="dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.275042 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6"} err="failed to get container status \"dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6\": rpc error: code = NotFound desc = could not find container \"dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6\": container with ID starting with dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6 not found: ID does not exist" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.275057 4753 scope.go:117] "RemoveContainer" containerID="0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.276301 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4"} err="failed to get container status \"0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4\": rpc error: code = NotFound desc = could not find container \"0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4\": container with ID starting with 0da4111a6092942fd002629ec6fc5f4156f9d1ea4743a0c567babb9d217ce9f4 not found: ID does not exist" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.276341 4753 scope.go:117] "RemoveContainer" containerID="dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.276841 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6"} err="failed to get container status \"dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6\": rpc error: code = NotFound desc = could not find container \"dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6\": container with ID starting with dc9419665d7604b0ac2cfa60634e9656057c80e6acf06790b9a263afe2da10a6 not found: ID does not exist" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.276862 4753 scope.go:117] "RemoveContainer" containerID="5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.278949 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.289962 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.297347 
4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:32 crc kubenswrapper[4753]: E0129 15:34:32.297746 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerName="nova-metadata-metadata" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.297769 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerName="nova-metadata-metadata" Jan 29 15:34:32 crc kubenswrapper[4753]: E0129 15:34:32.297800 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerName="nova-api-api" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.297808 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerName="nova-api-api" Jan 29 15:34:32 crc kubenswrapper[4753]: E0129 15:34:32.297830 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerName="nova-api-log" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.297838 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerName="nova-api-log" Jan 29 15:34:32 crc kubenswrapper[4753]: E0129 15:34:32.297851 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de6fe34-0d8e-494e-8cea-b91c8e437b88" containerName="nova-manage" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.297859 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de6fe34-0d8e-494e-8cea-b91c8e437b88" containerName="nova-manage" Jan 29 15:34:32 crc kubenswrapper[4753]: E0129 15:34:32.297874 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerName="nova-metadata-log" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.297882 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerName="nova-metadata-log" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.298048 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerName="nova-api-log" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.298060 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de6fe34-0d8e-494e-8cea-b91c8e437b88" containerName="nova-manage" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.298073 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerName="nova-metadata-log" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.298081 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b8773f-8f94-456c-9075-fd1c69ad7e20" containerName="nova-api-api" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.298088 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" containerName="nova-metadata-metadata" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.299002 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.302729 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.303108 4753 scope.go:117] "RemoveContainer" containerID="852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.307009 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.308693 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.312574 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.316340 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.325185 4753 scope.go:117] "RemoveContainer" containerID="5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4" Jan 29 15:34:32 crc kubenswrapper[4753]: E0129 15:34:32.326100 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4\": container with ID starting with 5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4 not found: ID does not exist" containerID="5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.326139 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4"} err="failed to get container status \"5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4\": rpc error: code = NotFound desc = could not find container \"5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4\": container with ID starting with 5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4 not found: ID does not exist" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.326207 4753 scope.go:117] "RemoveContainer" containerID="852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51" Jan 29 15:34:32 crc kubenswrapper[4753]: E0129 15:34:32.326623 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51\": container with ID starting with 852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51 not found: ID does not exist" containerID="852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.326671 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51"} err="failed to get container status \"852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51\": rpc error: code = NotFound desc = could not find container \"852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51\": container with ID starting with 852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51 not found: ID does not exist" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 
15:34:32.326706 4753 scope.go:117] "RemoveContainer" containerID="5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.328655 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4"} err="failed to get container status \"5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4\": rpc error: code = NotFound desc = could not find container \"5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4\": container with ID starting with 5bd981222bdfa671b7eb5a28132a8e10f655e6c861eea4e9eed50eb182c9b9b4 not found: ID does not exist" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.328719 4753 scope.go:117] "RemoveContainer" containerID="852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.329100 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51"} err="failed to get container status \"852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51\": rpc error: code = NotFound desc = could not find container \"852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51\": container with ID starting with 852f31c7f1c7d6928b268915dbe9e13e90148b04caef7777aa8bdd4edc587e51 not found: ID does not exist" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.333677 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.431480 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-config-data\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.431526 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-logs\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.431566 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dw4s\" (UniqueName: \"kubernetes.io/projected/23e994d8-c397-46ea-8c59-4a90a324d558-kube-api-access-8dw4s\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.431618 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e994d8-c397-46ea-8c59-4a90a324d558-logs\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.431668 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 
15:34:32.431691 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-config-data\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.431709 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lrpf\" (UniqueName: \"kubernetes.io/projected/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-kube-api-access-9lrpf\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.431725 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.533049 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.533098 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-config-data\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.533121 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lrpf\" (UniqueName: \"kubernetes.io/projected/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-kube-api-access-9lrpf\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.533140 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.533203 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-config-data\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.533218 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-logs\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.533274 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dw4s\" (UniqueName: \"kubernetes.io/projected/23e994d8-c397-46ea-8c59-4a90a324d558-kube-api-access-8dw4s\") pod \"nova-metadata-0\" (UID: 
\"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.533334 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e994d8-c397-46ea-8c59-4a90a324d558-logs\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.533697 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e994d8-c397-46ea-8c59-4a90a324d558-logs\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.534558 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-logs\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.550093 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.550445 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-config-data\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.550489 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-config-data\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.553314 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.556908 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dw4s\" (UniqueName: \"kubernetes.io/projected/23e994d8-c397-46ea-8c59-4a90a324d558-kube-api-access-8dw4s\") pod \"nova-metadata-0\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.557938 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lrpf\" (UniqueName: \"kubernetes.io/projected/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-kube-api-access-9lrpf\") pod \"nova-api-0\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.620548 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.637629 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.661438 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.673292 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.686438 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.754482 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69df65c7dc-vl6ck"] Jan 29 15:34:32 crc kubenswrapper[4753]: I0129 15:34:32.754924 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" podUID="73c9c3a0-87c7-4f79-b0a7-3964416ea053" containerName="dnsmasq-dns" containerID="cri-o://5d38142ecaab2439dc4320a1292f6aa03921c4d9d55f4c65cf8ba2929c50bcf3" gracePeriod=10 Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.184455 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.232966 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23e994d8-c397-46ea-8c59-4a90a324d558","Type":"ContainerStarted","Data":"38ec96630c320c9b09f1d4b0f49863dfa75e7d7b3c95b41d52e1eddb28a00586"} Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.238582 4753 generic.go:334] "Generic (PLEG): container finished" podID="73c9c3a0-87c7-4f79-b0a7-3964416ea053" containerID="5d38142ecaab2439dc4320a1292f6aa03921c4d9d55f4c65cf8ba2929c50bcf3" exitCode=0 Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.238653 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" event={"ID":"73c9c3a0-87c7-4f79-b0a7-3964416ea053","Type":"ContainerDied","Data":"5d38142ecaab2439dc4320a1292f6aa03921c4d9d55f4c65cf8ba2929c50bcf3"} Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.248639 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.285214 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.347630 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.452719 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-config\") pod \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.452784 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-nb\") pod \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.452932 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfk2t\" (UniqueName: \"kubernetes.io/projected/73c9c3a0-87c7-4f79-b0a7-3964416ea053-kube-api-access-jfk2t\") pod \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.452968 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-sb\") pod \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.453037 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-dns-svc\") pod \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\" (UID: \"73c9c3a0-87c7-4f79-b0a7-3964416ea053\") " Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.519728 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c9c3a0-87c7-4f79-b0a7-3964416ea053-kube-api-access-jfk2t" (OuterVolumeSpecName: "kube-api-access-jfk2t") pod "73c9c3a0-87c7-4f79-b0a7-3964416ea053" (UID: "73c9c3a0-87c7-4f79-b0a7-3964416ea053"). InnerVolumeSpecName "kube-api-access-jfk2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.562644 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfk2t\" (UniqueName: \"kubernetes.io/projected/73c9c3a0-87c7-4f79-b0a7-3964416ea053-kube-api-access-jfk2t\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.579863 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73c9c3a0-87c7-4f79-b0a7-3964416ea053" (UID: "73c9c3a0-87c7-4f79-b0a7-3964416ea053"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.604728 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73c9c3a0-87c7-4f79-b0a7-3964416ea053" (UID: "73c9c3a0-87c7-4f79-b0a7-3964416ea053"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.614225 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-config" (OuterVolumeSpecName: "config") pod "73c9c3a0-87c7-4f79-b0a7-3964416ea053" (UID: "73c9c3a0-87c7-4f79-b0a7-3964416ea053"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.649646 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73c9c3a0-87c7-4f79-b0a7-3964416ea053" (UID: "73c9c3a0-87c7-4f79-b0a7-3964416ea053"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.664090 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.664138 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.664177 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:33 crc kubenswrapper[4753]: I0129 15:34:33.664189 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73c9c3a0-87c7-4f79-b0a7-3964416ea053-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.160303 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b8773f-8f94-456c-9075-fd1c69ad7e20" path="/var/lib/kubelet/pods/03b8773f-8f94-456c-9075-fd1c69ad7e20/volumes" Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.161017 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392865b5-bb31-439f-b001-b6ef6b7a1a4c" path="/var/lib/kubelet/pods/392865b5-bb31-439f-b001-b6ef6b7a1a4c/volumes" Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.253602 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" event={"ID":"73c9c3a0-87c7-4f79-b0a7-3964416ea053","Type":"ContainerDied","Data":"d800ae7096ed727b106b47773d98b332fc1b9e68663a2b7006a829e4c653d061"} Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.253633 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69df65c7dc-vl6ck" Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.253688 4753 scope.go:117] "RemoveContainer" containerID="5d38142ecaab2439dc4320a1292f6aa03921c4d9d55f4c65cf8ba2929c50bcf3" Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.257731 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23e994d8-c397-46ea-8c59-4a90a324d558","Type":"ContainerStarted","Data":"3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8"} Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.259776 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23e994d8-c397-46ea-8c59-4a90a324d558","Type":"ContainerStarted","Data":"a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c"} Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.272497 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"81c7beaf-bbe4-474d-a57b-fc3d4e076c62","Type":"ContainerStarted","Data":"ffda8e1dd45a7a0454677be029cd8a457941119cf5c551962a9c8e8fe53daf0a"} Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.272541 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"81c7beaf-bbe4-474d-a57b-fc3d4e076c62","Type":"ContainerStarted","Data":"24b74692fa3128d6df9a65671c64a796937f2ca62652dbecbdbcfa2fc90dff0a"} Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.272558 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"81c7beaf-bbe4-474d-a57b-fc3d4e076c62","Type":"ContainerStarted","Data":"71c2e28fc36d3c7fbd9804b9536681ddd45ba5986dba819f3c1feca02fb779d7"} Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.280788 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69df65c7dc-vl6ck"] Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.295099 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69df65c7dc-vl6ck"] Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.300166 4753 scope.go:117] "RemoveContainer" containerID="8ec22344cf77cc426962964ec25f6888fe513c8ec20ee46c886ef1f8cf055a83" Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.314338 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.314315832 podStartE2EDuration="2.314315832s" podCreationTimestamp="2026-01-29 15:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:34.312480722 +0000 UTC m=+5509.007215104" watchObservedRunningTime="2026-01-29 15:34:34.314315832 +0000 UTC m=+5509.009050224" Jan 29 15:34:34 crc kubenswrapper[4753]: I0129 15:34:34.334404 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.334382939 podStartE2EDuration="2.334382939s" podCreationTimestamp="2026-01-29 15:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:34.329500898 +0000 UTC m=+5509.024235290" watchObservedRunningTime="2026-01-29 15:34:34.334382939 +0000 UTC m=+5509.029117311" Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.286583 4753 generic.go:334] "Generic (PLEG): container finished" podID="f8bfa9d3-3e74-41d4-a434-873bbc09dfd5" 
containerID="9a75ac00c7daba2c415d2e3c7b0591c8b551d4903fd8edd980d0e552aba51abf" exitCode=0 Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.286667 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5","Type":"ContainerDied","Data":"9a75ac00c7daba2c415d2e3c7b0591c8b551d4903fd8edd980d0e552aba51abf"} Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.287060 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5","Type":"ContainerDied","Data":"b671f2f92cead6de96ac49d6678c265f605ddb4cdb3d256049c509bf8e42a3d9"} Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.287078 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b671f2f92cead6de96ac49d6678c265f605ddb4cdb3d256049c509bf8e42a3d9" Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.295198 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.395055 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-combined-ca-bundle\") pod \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.395373 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25gs2\" (UniqueName: \"kubernetes.io/projected/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-kube-api-access-25gs2\") pod \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.395529 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-config-data\") pod \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\" (UID: \"f8bfa9d3-3e74-41d4-a434-873bbc09dfd5\") " Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.399955 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-kube-api-access-25gs2" (OuterVolumeSpecName: "kube-api-access-25gs2") pod "f8bfa9d3-3e74-41d4-a434-873bbc09dfd5" (UID: "f8bfa9d3-3e74-41d4-a434-873bbc09dfd5"). InnerVolumeSpecName "kube-api-access-25gs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.420869 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-config-data" (OuterVolumeSpecName: "config-data") pod "f8bfa9d3-3e74-41d4-a434-873bbc09dfd5" (UID: "f8bfa9d3-3e74-41d4-a434-873bbc09dfd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.424813 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8bfa9d3-3e74-41d4-a434-873bbc09dfd5" (UID: "f8bfa9d3-3e74-41d4-a434-873bbc09dfd5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.498404 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.498448 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25gs2\" (UniqueName: \"kubernetes.io/projected/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-kube-api-access-25gs2\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:35 crc kubenswrapper[4753]: I0129 15:34:35.498462 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.160312 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c9c3a0-87c7-4f79-b0a7-3964416ea053" path="/var/lib/kubelet/pods/73c9c3a0-87c7-4f79-b0a7-3964416ea053/volumes" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.297224 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.330999 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.351123 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.360777 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:36 crc kubenswrapper[4753]: E0129 15:34:36.361277 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c9c3a0-87c7-4f79-b0a7-3964416ea053" containerName="dnsmasq-dns" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.361302 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c9c3a0-87c7-4f79-b0a7-3964416ea053" containerName="dnsmasq-dns" Jan 29 15:34:36 crc kubenswrapper[4753]: E0129 15:34:36.361317 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c9c3a0-87c7-4f79-b0a7-3964416ea053" containerName="init" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.361325 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c9c3a0-87c7-4f79-b0a7-3964416ea053" containerName="init" Jan 29 15:34:36 crc kubenswrapper[4753]: E0129 15:34:36.361351 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bfa9d3-3e74-41d4-a434-873bbc09dfd5" containerName="nova-scheduler-scheduler" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.361358 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bfa9d3-3e74-41d4-a434-873bbc09dfd5" containerName="nova-scheduler-scheduler" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.361576 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c9c3a0-87c7-4f79-b0a7-3964416ea053" containerName="dnsmasq-dns" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.361608 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bfa9d3-3e74-41d4-a434-873bbc09dfd5" containerName="nova-scheduler-scheduler" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.362322 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.368009 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.370616 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.530867 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lx7p\" (UniqueName: \"kubernetes.io/projected/5fbbe2a4-d836-4639-aeed-4706ba424bc8-kube-api-access-8lx7p\") pod \"nova-scheduler-0\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.531019 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.531376 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-config-data\") pod \"nova-scheduler-0\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.633376 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-config-data\") pod \"nova-scheduler-0\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.633550 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lx7p\" (UniqueName: \"kubernetes.io/projected/5fbbe2a4-d836-4639-aeed-4706ba424bc8-kube-api-access-8lx7p\") pod \"nova-scheduler-0\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.633607 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.640135 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-config-data\") pod \"nova-scheduler-0\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.640353 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.655424 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lx7p\" (UniqueName: 
\"kubernetes.io/projected/5fbbe2a4-d836-4639-aeed-4706ba424bc8-kube-api-access-8lx7p\") pod \"nova-scheduler-0\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:36 crc kubenswrapper[4753]: I0129 15:34:36.705346 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:37 crc kubenswrapper[4753]: I0129 15:34:37.170849 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:37 crc kubenswrapper[4753]: I0129 15:34:37.306747 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5fbbe2a4-d836-4639-aeed-4706ba424bc8","Type":"ContainerStarted","Data":"7f4d22d74f89ecf9f7d2c44c8edc2190ec74bde929b2bf20526d3fb9508a28f8"} Jan 29 15:34:37 crc kubenswrapper[4753]: I0129 15:34:37.620978 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 15:34:37 crc kubenswrapper[4753]: I0129 15:34:37.621040 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 15:34:38 crc kubenswrapper[4753]: I0129 15:34:38.171592 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bfa9d3-3e74-41d4-a434-873bbc09dfd5" path="/var/lib/kubelet/pods/f8bfa9d3-3e74-41d4-a434-873bbc09dfd5/volumes" Jan 29 15:34:38 crc kubenswrapper[4753]: I0129 15:34:38.318325 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5fbbe2a4-d836-4639-aeed-4706ba424bc8","Type":"ContainerStarted","Data":"560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac"} Jan 29 15:34:38 crc kubenswrapper[4753]: I0129 15:34:38.344981 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.344956434 podStartE2EDuration="2.344956434s" podCreationTimestamp="2026-01-29 15:34:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:38.33960335 +0000 UTC m=+5513.034337792" watchObservedRunningTime="2026-01-29 15:34:38.344956434 +0000 UTC m=+5513.039690826" Jan 29 15:34:40 crc kubenswrapper[4753]: I0129 15:34:40.613903 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.066044 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-d5h4f"] Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.067415 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.071993 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.072279 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.078853 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-d5h4f"] Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.237186 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-scripts\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.237517 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.237636 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf2v8\" (UniqueName: \"kubernetes.io/projected/e23b99b1-289f-4f3b-8de7-7567e21674a4-kube-api-access-xf2v8\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.237677 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-config-data\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.341023 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-scripts\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.341069 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.341203 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf2v8\" (UniqueName: \"kubernetes.io/projected/e23b99b1-289f-4f3b-8de7-7567e21674a4-kube-api-access-xf2v8\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.341245 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-config-data\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.346616 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-scripts\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.346831 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.354070 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-config-data\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.358636 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf2v8\" (UniqueName: \"kubernetes.io/projected/e23b99b1-289f-4f3b-8de7-7567e21674a4-kube-api-access-xf2v8\") pod \"nova-cell1-cell-mapping-d5h4f\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.407527 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.705466 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 15:34:41 crc kubenswrapper[4753]: I0129 15:34:41.882637 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-d5h4f"] Jan 29 15:34:42 crc kubenswrapper[4753]: I0129 15:34:42.374910 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-d5h4f" event={"ID":"e23b99b1-289f-4f3b-8de7-7567e21674a4","Type":"ContainerStarted","Data":"487fcdc593a8ac55d5d3583e9aaac21ca9431766bd0a898764b8a281a5f76e7d"} Jan 29 15:34:42 crc kubenswrapper[4753]: I0129 15:34:42.375291 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-d5h4f" event={"ID":"e23b99b1-289f-4f3b-8de7-7567e21674a4","Type":"ContainerStarted","Data":"888e100bb7875f8a49548bbcc0f68a8829fe49dc18cd80f7b3244958d6d1606a"} Jan 29 15:34:42 crc kubenswrapper[4753]: I0129 15:34:42.404597 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-d5h4f" podStartSLOduration=1.404575913 podStartE2EDuration="1.404575913s" podCreationTimestamp="2026-01-29 15:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:42.393901036 +0000 UTC m=+5517.088635458" watchObservedRunningTime="2026-01-29 15:34:42.404575913 +0000 UTC m=+5517.099310305" Jan 29 15:34:42 crc kubenswrapper[4753]: I0129 15:34:42.621196 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 15:34:42 crc kubenswrapper[4753]: I0129 15:34:42.622975 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 15:34:42 crc kubenswrapper[4753]: I0129 15:34:42.638810 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 15:34:42 crc kubenswrapper[4753]: I0129 15:34:42.638845 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 15:34:43 crc kubenswrapper[4753]: I0129 15:34:43.745471 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.78:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 15:34:43 crc kubenswrapper[4753]: I0129 15:34:43.745475 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.77:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 15:34:43 crc kubenswrapper[4753]: I0129 15:34:43.745517 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.78:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 15:34:43 crc kubenswrapper[4753]: I0129 15:34:43.745788 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.77:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 15:34:46 crc kubenswrapper[4753]: I0129 15:34:46.706509 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 29 15:34:46 crc kubenswrapper[4753]: I0129 15:34:46.738534 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 15:34:47 crc kubenswrapper[4753]: I0129 15:34:47.453829 4753 generic.go:334] "Generic (PLEG): container finished" podID="e23b99b1-289f-4f3b-8de7-7567e21674a4" containerID="487fcdc593a8ac55d5d3583e9aaac21ca9431766bd0a898764b8a281a5f76e7d" exitCode=0 Jan 29 15:34:47 crc kubenswrapper[4753]: I0129 15:34:47.453940 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-d5h4f" event={"ID":"e23b99b1-289f-4f3b-8de7-7567e21674a4","Type":"ContainerDied","Data":"487fcdc593a8ac55d5d3583e9aaac21ca9431766bd0a898764b8a281a5f76e7d"} Jan 29 15:34:47 crc kubenswrapper[4753]: I0129 15:34:47.486920 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.797678 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.802840 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-combined-ca-bundle\") pod \"e23b99b1-289f-4f3b-8de7-7567e21674a4\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.802920 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf2v8\" (UniqueName: \"kubernetes.io/projected/e23b99b1-289f-4f3b-8de7-7567e21674a4-kube-api-access-xf2v8\") pod \"e23b99b1-289f-4f3b-8de7-7567e21674a4\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.803062 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-config-data\") pod \"e23b99b1-289f-4f3b-8de7-7567e21674a4\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.803138 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-scripts\") pod \"e23b99b1-289f-4f3b-8de7-7567e21674a4\" (UID: \"e23b99b1-289f-4f3b-8de7-7567e21674a4\") " Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.809328 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-scripts" (OuterVolumeSpecName: "scripts") pod "e23b99b1-289f-4f3b-8de7-7567e21674a4" (UID: "e23b99b1-289f-4f3b-8de7-7567e21674a4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.810694 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e23b99b1-289f-4f3b-8de7-7567e21674a4-kube-api-access-xf2v8" (OuterVolumeSpecName: "kube-api-access-xf2v8") pod "e23b99b1-289f-4f3b-8de7-7567e21674a4" (UID: "e23b99b1-289f-4f3b-8de7-7567e21674a4"). InnerVolumeSpecName "kube-api-access-xf2v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.834993 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-config-data" (OuterVolumeSpecName: "config-data") pod "e23b99b1-289f-4f3b-8de7-7567e21674a4" (UID: "e23b99b1-289f-4f3b-8de7-7567e21674a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.838566 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e23b99b1-289f-4f3b-8de7-7567e21674a4" (UID: "e23b99b1-289f-4f3b-8de7-7567e21674a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.905935 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.905997 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf2v8\" (UniqueName: \"kubernetes.io/projected/e23b99b1-289f-4f3b-8de7-7567e21674a4-kube-api-access-xf2v8\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.906018 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:48 crc kubenswrapper[4753]: I0129 15:34:48.906036 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e23b99b1-289f-4f3b-8de7-7567e21674a4-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.476501 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-d5h4f" event={"ID":"e23b99b1-289f-4f3b-8de7-7567e21674a4","Type":"ContainerDied","Data":"888e100bb7875f8a49548bbcc0f68a8829fe49dc18cd80f7b3244958d6d1606a"} Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.476539 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-d5h4f" Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.476553 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="888e100bb7875f8a49548bbcc0f68a8829fe49dc18cd80f7b3244958d6d1606a" Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.676959 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.677404 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-log" containerID="cri-o://24b74692fa3128d6df9a65671c64a796937f2ca62652dbecbdbcfa2fc90dff0a" gracePeriod=30 Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.677492 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-api" containerID="cri-o://ffda8e1dd45a7a0454677be029cd8a457941119cf5c551962a9c8e8fe53daf0a" gracePeriod=30 Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.703660 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.703882 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5fbbe2a4-d836-4639-aeed-4706ba424bc8" containerName="nova-scheduler-scheduler" containerID="cri-o://560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac" gracePeriod=30 Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.764892 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.765177 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-log" containerID="cri-o://a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c" gracePeriod=30 Jan 29 15:34:49 crc kubenswrapper[4753]: I0129 15:34:49.765330 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-metadata" containerID="cri-o://3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8" gracePeriod=30 Jan 29 15:34:50 crc kubenswrapper[4753]: I0129 15:34:50.487731 4753 generic.go:334] "Generic (PLEG): container finished" podID="23e994d8-c397-46ea-8c59-4a90a324d558" containerID="a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c" exitCode=143 Jan 29 15:34:50 crc kubenswrapper[4753]: I0129 15:34:50.487793 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23e994d8-c397-46ea-8c59-4a90a324d558","Type":"ContainerDied","Data":"a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c"} Jan 29 15:34:50 crc kubenswrapper[4753]: I0129 15:34:50.490898 4753 generic.go:334] "Generic (PLEG): container finished" podID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerID="24b74692fa3128d6df9a65671c64a796937f2ca62652dbecbdbcfa2fc90dff0a" exitCode=143 Jan 29 15:34:50 crc kubenswrapper[4753]: I0129 15:34:50.490958 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"81c7beaf-bbe4-474d-a57b-fc3d4e076c62","Type":"ContainerDied","Data":"24b74692fa3128d6df9a65671c64a796937f2ca62652dbecbdbcfa2fc90dff0a"} Jan 29 15:34:51 crc kubenswrapper[4753]: E0129 15:34:51.708540 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 15:34:51 crc kubenswrapper[4753]: E0129 15:34:51.710340 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 15:34:51 crc kubenswrapper[4753]: E0129 15:34:51.711655 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 15:34:51 crc kubenswrapper[4753]: E0129 15:34:51.711701 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5fbbe2a4-d836-4639-aeed-4706ba424bc8" containerName="nova-scheduler-scheduler" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.407068 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.526405 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dw4s\" (UniqueName: \"kubernetes.io/projected/23e994d8-c397-46ea-8c59-4a90a324d558-kube-api-access-8dw4s\") pod \"23e994d8-c397-46ea-8c59-4a90a324d558\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.526586 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-config-data\") pod \"23e994d8-c397-46ea-8c59-4a90a324d558\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.526683 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e994d8-c397-46ea-8c59-4a90a324d558-logs\") pod \"23e994d8-c397-46ea-8c59-4a90a324d558\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.526691 4753 generic.go:334] "Generic (PLEG): container finished" podID="23e994d8-c397-46ea-8c59-4a90a324d558" containerID="3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8" exitCode=0 Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.526801 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-combined-ca-bundle\") pod \"23e994d8-c397-46ea-8c59-4a90a324d558\" (UID: \"23e994d8-c397-46ea-8c59-4a90a324d558\") " Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.526840 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.527227 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23e994d8-c397-46ea-8c59-4a90a324d558","Type":"ContainerDied","Data":"3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8"} Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.527281 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23e994d8-c397-46ea-8c59-4a90a324d558","Type":"ContainerDied","Data":"38ec96630c320c9b09f1d4b0f49863dfa75e7d7b3c95b41d52e1eddb28a00586"} Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.527310 4753 scope.go:117] "RemoveContainer" containerID="3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.527729 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23e994d8-c397-46ea-8c59-4a90a324d558-logs" (OuterVolumeSpecName: "logs") pod "23e994d8-c397-46ea-8c59-4a90a324d558" (UID: "23e994d8-c397-46ea-8c59-4a90a324d558"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.533096 4753 generic.go:334] "Generic (PLEG): container finished" podID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerID="ffda8e1dd45a7a0454677be029cd8a457941119cf5c551962a9c8e8fe53daf0a" exitCode=0 Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.533135 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"81c7beaf-bbe4-474d-a57b-fc3d4e076c62","Type":"ContainerDied","Data":"ffda8e1dd45a7a0454677be029cd8a457941119cf5c551962a9c8e8fe53daf0a"} Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.533391 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e994d8-c397-46ea-8c59-4a90a324d558-kube-api-access-8dw4s" (OuterVolumeSpecName: "kube-api-access-8dw4s") pod "23e994d8-c397-46ea-8c59-4a90a324d558" (UID: "23e994d8-c397-46ea-8c59-4a90a324d558"). InnerVolumeSpecName "kube-api-access-8dw4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.554009 4753 scope.go:117] "RemoveContainer" containerID="a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.555614 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-config-data" (OuterVolumeSpecName: "config-data") pod "23e994d8-c397-46ea-8c59-4a90a324d558" (UID: "23e994d8-c397-46ea-8c59-4a90a324d558"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.556816 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23e994d8-c397-46ea-8c59-4a90a324d558" (UID: "23e994d8-c397-46ea-8c59-4a90a324d558"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.574691 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.589744 4753 scope.go:117] "RemoveContainer" containerID="3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8" Jan 29 15:34:53 crc kubenswrapper[4753]: E0129 15:34:53.590179 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8\": container with ID starting with 3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8 not found: ID does not exist" containerID="3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.590208 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8"} err="failed to get container status \"3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8\": rpc error: code = NotFound desc = could not find container \"3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8\": container with ID starting with 3b283be826e22b87ff22949beaeee46a13fe3225a0d918c41586d7551fe2d6e8 not found: ID does not exist" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.590228 4753 scope.go:117] "RemoveContainer" containerID="a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c" Jan 29 15:34:53 crc kubenswrapper[4753]: E0129 15:34:53.590636 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c\": container with ID starting with a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c not found: ID does not exist" containerID="a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.590660 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c"} err="failed to get container status \"a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c\": rpc error: code = NotFound desc = could not find container \"a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c\": container with ID starting with a52706dfaecd413e5d70f46519de763eaf5d14ffebd28eedc086bcc84021362c not found: ID does not exist" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.628579 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.628612 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e994d8-c397-46ea-8c59-4a90a324d558-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.628622 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e994d8-c397-46ea-8c59-4a90a324d558-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.628631 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dw4s\" (UniqueName: \"kubernetes.io/projected/23e994d8-c397-46ea-8c59-4a90a324d558-kube-api-access-8dw4s\") 
on node \"crc\" DevicePath \"\"" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.729718 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lrpf\" (UniqueName: \"kubernetes.io/projected/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-kube-api-access-9lrpf\") pod \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.729835 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-combined-ca-bundle\") pod \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.729921 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-logs\") pod \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.729996 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-config-data\") pod \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\" (UID: \"81c7beaf-bbe4-474d-a57b-fc3d4e076c62\") " Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.730848 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-logs" (OuterVolumeSpecName: "logs") pod "81c7beaf-bbe4-474d-a57b-fc3d4e076c62" (UID: "81c7beaf-bbe4-474d-a57b-fc3d4e076c62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.733991 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-kube-api-access-9lrpf" (OuterVolumeSpecName: "kube-api-access-9lrpf") pod "81c7beaf-bbe4-474d-a57b-fc3d4e076c62" (UID: "81c7beaf-bbe4-474d-a57b-fc3d4e076c62"). InnerVolumeSpecName "kube-api-access-9lrpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.753251 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-config-data" (OuterVolumeSpecName: "config-data") pod "81c7beaf-bbe4-474d-a57b-fc3d4e076c62" (UID: "81c7beaf-bbe4-474d-a57b-fc3d4e076c62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.761848 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81c7beaf-bbe4-474d-a57b-fc3d4e076c62" (UID: "81c7beaf-bbe4-474d-a57b-fc3d4e076c62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.832774 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lrpf\" (UniqueName: \"kubernetes.io/projected/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-kube-api-access-9lrpf\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.832813 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.832828 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.832844 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c7beaf-bbe4-474d-a57b-fc3d4e076c62-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.877431 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.895007 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905092 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:53 crc kubenswrapper[4753]: E0129 15:34:53.905493 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-log" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905513 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-log" Jan 29 15:34:53 crc kubenswrapper[4753]: E0129 15:34:53.905530 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-api" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905536 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-api" Jan 29 15:34:53 crc kubenswrapper[4753]: E0129 15:34:53.905545 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-log" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905552 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-log" Jan 29 15:34:53 crc kubenswrapper[4753]: E0129 15:34:53.905561 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-metadata" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905568 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-metadata" Jan 29 15:34:53 crc kubenswrapper[4753]: E0129 15:34:53.905574 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e23b99b1-289f-4f3b-8de7-7567e21674a4" containerName="nova-manage" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905579 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="e23b99b1-289f-4f3b-8de7-7567e21674a4" containerName="nova-manage" Jan 29 
15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905777 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-api" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905796 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-metadata" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905807 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" containerName="nova-api-log" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905820 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="e23b99b1-289f-4f3b-8de7-7567e21674a4" containerName="nova-manage" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.905838 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" containerName="nova-metadata-log" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.906961 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.908954 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.929557 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:53 crc kubenswrapper[4753]: I0129 15:34:53.977006 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.037000 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.037077 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-config-data\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.037366 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j4zg\" (UniqueName: \"kubernetes.io/projected/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-kube-api-access-9j4zg\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.037483 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-logs\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.138720 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lx7p\" (UniqueName: \"kubernetes.io/projected/5fbbe2a4-d836-4639-aeed-4706ba424bc8-kube-api-access-8lx7p\") pod \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\" (UID: 
\"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.138836 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-config-data\") pod \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.138881 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-combined-ca-bundle\") pod \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\" (UID: \"5fbbe2a4-d836-4639-aeed-4706ba424bc8\") " Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.139172 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.139217 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-config-data\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.139313 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j4zg\" (UniqueName: \"kubernetes.io/projected/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-kube-api-access-9j4zg\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.139366 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-logs\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.139871 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-logs\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.143010 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbbe2a4-d836-4639-aeed-4706ba424bc8-kube-api-access-8lx7p" (OuterVolumeSpecName: "kube-api-access-8lx7p") pod "5fbbe2a4-d836-4639-aeed-4706ba424bc8" (UID: "5fbbe2a4-d836-4639-aeed-4706ba424bc8"). InnerVolumeSpecName "kube-api-access-8lx7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.143961 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-config-data\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.145680 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.161185 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e994d8-c397-46ea-8c59-4a90a324d558" path="/var/lib/kubelet/pods/23e994d8-c397-46ea-8c59-4a90a324d558/volumes" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.162635 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j4zg\" (UniqueName: \"kubernetes.io/projected/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-kube-api-access-9j4zg\") pod \"nova-metadata-0\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.163350 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fbbe2a4-d836-4639-aeed-4706ba424bc8" (UID: "5fbbe2a4-d836-4639-aeed-4706ba424bc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.173121 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-config-data" (OuterVolumeSpecName: "config-data") pod "5fbbe2a4-d836-4639-aeed-4706ba424bc8" (UID: "5fbbe2a4-d836-4639-aeed-4706ba424bc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.231455 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.241468 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.241682 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lx7p\" (UniqueName: \"kubernetes.io/projected/5fbbe2a4-d836-4639-aeed-4706ba424bc8-kube-api-access-8lx7p\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.241812 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbbe2a4-d836-4639-aeed-4706ba424bc8-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.542054 4753 generic.go:334] "Generic (PLEG): container finished" podID="5fbbe2a4-d836-4639-aeed-4706ba424bc8" containerID="560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac" exitCode=0 Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.542112 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.542105 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5fbbe2a4-d836-4639-aeed-4706ba424bc8","Type":"ContainerDied","Data":"560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac"} Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.542659 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5fbbe2a4-d836-4639-aeed-4706ba424bc8","Type":"ContainerDied","Data":"7f4d22d74f89ecf9f7d2c44c8edc2190ec74bde929b2bf20526d3fb9508a28f8"} Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.542690 4753 scope.go:117] "RemoveContainer" containerID="560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.548842 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"81c7beaf-bbe4-474d-a57b-fc3d4e076c62","Type":"ContainerDied","Data":"71c2e28fc36d3c7fbd9804b9536681ddd45ba5986dba819f3c1feca02fb779d7"} Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.548901 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.576415 4753 scope.go:117] "RemoveContainer" containerID="560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac" Jan 29 15:34:54 crc kubenswrapper[4753]: E0129 15:34:54.576824 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac\": container with ID starting with 560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac not found: ID does not exist" containerID="560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.576858 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac"} err="failed to get container status \"560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac\": rpc error: code = NotFound desc = could not find container \"560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac\": container with ID starting with 560f15b030be55c8a4bda8fed4ed4d15bf84b6fa2a382b0d7a59810d35bf99ac not found: ID does not exist" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.576883 4753 scope.go:117] "RemoveContainer" containerID="ffda8e1dd45a7a0454677be029cd8a457941119cf5c551962a9c8e8fe53daf0a" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.589404 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.610366 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.642845 4753 scope.go:117] "RemoveContainer" containerID="24b74692fa3128d6df9a65671c64a796937f2ca62652dbecbdbcfa2fc90dff0a" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.643394 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:54 crc kubenswrapper[4753]: E0129 15:34:54.644647 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbbe2a4-d836-4639-aeed-4706ba424bc8" containerName="nova-scheduler-scheduler" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.644669 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbbe2a4-d836-4639-aeed-4706ba424bc8" containerName="nova-scheduler-scheduler" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.645042 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbbe2a4-d836-4639-aeed-4706ba424bc8" containerName="nova-scheduler-scheduler" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.646541 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.651026 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.661239 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.670040 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c21c48-52e6-4603-97e0-c33fc2ab2896-logs\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.670550 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-config-data\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.670702 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.670772 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxfv6\" (UniqueName: \"kubernetes.io/projected/d4c21c48-52e6-4603-97e0-c33fc2ab2896-kube-api-access-sxfv6\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.675456 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:54 crc kubenswrapper[4753]: W0129 15:34:54.691843 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod162bd0fb_4ccf_4276_8443_73b9b8a7f99f.slice/crio-e35cdb08c176052c7dab099e81f303f95794c53c57aade871e8fb35d91f54d7a WatchSource:0}: Error finding container e35cdb08c176052c7dab099e81f303f95794c53c57aade871e8fb35d91f54d7a: Status 404 returned error can't find the container with id e35cdb08c176052c7dab099e81f303f95794c53c57aade871e8fb35d91f54d7a Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.695595 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.705224 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.707332 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.709356 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.729778 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.748462 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.772779 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-config-data\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.772836 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.772865 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxfv6\" (UniqueName: \"kubernetes.io/projected/d4c21c48-52e6-4603-97e0-c33fc2ab2896-kube-api-access-sxfv6\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.772983 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c21c48-52e6-4603-97e0-c33fc2ab2896-logs\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.773429 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c21c48-52e6-4603-97e0-c33fc2ab2896-logs\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.778313 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.778545 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-config-data\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.789429 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxfv6\" (UniqueName: \"kubernetes.io/projected/d4c21c48-52e6-4603-97e0-c33fc2ab2896-kube-api-access-sxfv6\") pod \"nova-api-0\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.874407 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.874508 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-config-data\") pod \"nova-scheduler-0\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.874643 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89rsp\" (UniqueName: \"kubernetes.io/projected/dcad9510-6729-4c00-8857-b386e6f05806-kube-api-access-89rsp\") pod \"nova-scheduler-0\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.974698 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.976196 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-config-data\") pod \"nova-scheduler-0\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.976340 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89rsp\" (UniqueName: \"kubernetes.io/projected/dcad9510-6729-4c00-8857-b386e6f05806-kube-api-access-89rsp\") pod \"nova-scheduler-0\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.976396 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.981175 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-config-data\") pod \"nova-scheduler-0\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.985633 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:54 crc kubenswrapper[4753]: I0129 15:34:54.994805 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89rsp\" (UniqueName: \"kubernetes.io/projected/dcad9510-6729-4c00-8857-b386e6f05806-kube-api-access-89rsp\") pod \"nova-scheduler-0\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " pod="openstack/nova-scheduler-0" Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.023871 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.331661 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.419076 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.562450 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4c21c48-52e6-4603-97e0-c33fc2ab2896","Type":"ContainerStarted","Data":"2ad220de0eaaab558a38cafeaf525e2d98b083a3b37926ee23c53332eb534748"} Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.567290 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcad9510-6729-4c00-8857-b386e6f05806","Type":"ContainerStarted","Data":"a9676b80d4b1fc0fd2c6f6ba0b692b5d43430ed22e2b95200a192ecd7d07e539"} Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.569311 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bd0fb-4ccf-4276-8443-73b9b8a7f99f","Type":"ContainerStarted","Data":"d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5"} Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.569355 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bd0fb-4ccf-4276-8443-73b9b8a7f99f","Type":"ContainerStarted","Data":"6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7"} Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.569371 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bd0fb-4ccf-4276-8443-73b9b8a7f99f","Type":"ContainerStarted","Data":"e35cdb08c176052c7dab099e81f303f95794c53c57aade871e8fb35d91f54d7a"} Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.584421 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.584404307 podStartE2EDuration="1.584404307s" podCreationTimestamp="2026-01-29 15:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:55.579980568 +0000 UTC m=+5530.274714960" watchObservedRunningTime="2026-01-29 15:34:55.584404307 +0000 UTC m=+5530.279138689" Jan 29 15:34:55 crc kubenswrapper[4753]: I0129 15:34:55.611131 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.611106672 podStartE2EDuration="2.611106672s" podCreationTimestamp="2026-01-29 15:34:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:55.599623874 +0000 UTC m=+5530.294358266" watchObservedRunningTime="2026-01-29 15:34:55.611106672 +0000 UTC m=+5530.305841064" Jan 29 15:34:56 crc kubenswrapper[4753]: I0129 15:34:56.158954 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fbbe2a4-d836-4639-aeed-4706ba424bc8" path="/var/lib/kubelet/pods/5fbbe2a4-d836-4639-aeed-4706ba424bc8/volumes" Jan 29 15:34:56 crc kubenswrapper[4753]: I0129 15:34:56.160080 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c7beaf-bbe4-474d-a57b-fc3d4e076c62" path="/var/lib/kubelet/pods/81c7beaf-bbe4-474d-a57b-fc3d4e076c62/volumes" Jan 29 15:34:56 crc kubenswrapper[4753]: I0129 15:34:56.577678 4753 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcad9510-6729-4c00-8857-b386e6f05806","Type":"ContainerStarted","Data":"0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336"} Jan 29 15:34:56 crc kubenswrapper[4753]: I0129 15:34:56.579392 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4c21c48-52e6-4603-97e0-c33fc2ab2896","Type":"ContainerStarted","Data":"f796d177b20c89bfe938e46351a6fd8893f34714c53ff5ac2b153681d17ac15b"} Jan 29 15:34:56 crc kubenswrapper[4753]: I0129 15:34:56.579477 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4c21c48-52e6-4603-97e0-c33fc2ab2896","Type":"ContainerStarted","Data":"7c9b4c32c91b8c6d6bdeebd5167d5892a44be83f419750e6c28653d5708adae6"} Jan 29 15:34:56 crc kubenswrapper[4753]: I0129 15:34:56.604023 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.604005404 podStartE2EDuration="2.604005404s" podCreationTimestamp="2026-01-29 15:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:34:56.600517371 +0000 UTC m=+5531.295251783" watchObservedRunningTime="2026-01-29 15:34:56.604005404 +0000 UTC m=+5531.298739786" Jan 29 15:34:57 crc kubenswrapper[4753]: I0129 15:34:57.055287 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:34:57 crc kubenswrapper[4753]: I0129 15:34:57.055343 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:34:57 crc kubenswrapper[4753]: I0129 15:34:57.055386 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 15:34:57 crc kubenswrapper[4753]: I0129 15:34:57.056133 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20e283980ad77b065d0dfa0d4018e594dc6a0c2625911542352b6567ce9e5f09"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 15:34:57 crc kubenswrapper[4753]: I0129 15:34:57.056215 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://20e283980ad77b065d0dfa0d4018e594dc6a0c2625911542352b6567ce9e5f09" gracePeriod=600 Jan 29 15:34:57 crc kubenswrapper[4753]: I0129 15:34:57.589765 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="20e283980ad77b065d0dfa0d4018e594dc6a0c2625911542352b6567ce9e5f09" exitCode=0 Jan 29 15:34:57 crc kubenswrapper[4753]: I0129 15:34:57.589846 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"20e283980ad77b065d0dfa0d4018e594dc6a0c2625911542352b6567ce9e5f09"} Jan 29 15:34:57 crc kubenswrapper[4753]: I0129 15:34:57.590190 4753 scope.go:117] "RemoveContainer" containerID="2eaf6def470adba5d133da8607e9aee72784be178bb9081770f8efef4ae7f7ca" Jan 29 15:34:58 crc kubenswrapper[4753]: I0129 15:34:58.599928 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"} Jan 29 15:34:59 crc kubenswrapper[4753]: I0129 15:34:59.232265 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 15:34:59 crc kubenswrapper[4753]: I0129 15:34:59.232604 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 15:35:00 crc kubenswrapper[4753]: I0129 15:35:00.024660 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 29 15:35:04 crc kubenswrapper[4753]: I0129 15:35:04.231844 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 15:35:04 crc kubenswrapper[4753]: I0129 15:35:04.232332 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 15:35:04 crc kubenswrapper[4753]: I0129 15:35:04.974878 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 15:35:04 crc kubenswrapper[4753]: I0129 15:35:04.975204 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 15:35:05 crc kubenswrapper[4753]: I0129 15:35:05.025025 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 29 15:35:05 crc kubenswrapper[4753]: I0129 15:35:05.070182 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 29 15:35:05 crc kubenswrapper[4753]: I0129 15:35:05.314422 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 15:35:05 crc kubenswrapper[4753]: I0129 15:35:05.314519 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 15:35:05 crc kubenswrapper[4753]: I0129 15:35:05.703332 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 15:35:06 crc kubenswrapper[4753]: I0129 15:35:06.057441 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 29 15:35:06 crc 
Jan 29 15:35:07 crc kubenswrapper[4753]: I0129 15:35:07.404451 4753 scope.go:117] "RemoveContainer" containerID="35f784f78a74ebf146871cc1f34d0e1771c60d7180ab610c68c41a92b60eac57"
Jan 29 15:35:14 crc kubenswrapper[4753]: I0129 15:35:14.233885 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 29 15:35:14 crc kubenswrapper[4753]: I0129 15:35:14.234538 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 29 15:35:14 crc kubenswrapper[4753]: I0129 15:35:14.236443 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 29 15:35:14 crc kubenswrapper[4753]: I0129 15:35:14.236513 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 29 15:35:14 crc kubenswrapper[4753]: I0129 15:35:14.979725 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 29 15:35:14 crc kubenswrapper[4753]: I0129 15:35:14.980296 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 29 15:35:14 crc kubenswrapper[4753]: I0129 15:35:14.981068 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 29 15:35:14 crc kubenswrapper[4753]: I0129 15:35:14.984400 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 29 15:35:15 crc kubenswrapper[4753]: I0129 15:35:15.800138 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 29 15:35:15 crc kubenswrapper[4753]: I0129 15:35:15.805603 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.019777 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c78759bc5-lq749"]
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.021975 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.039318 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c78759bc5-lq749"]
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.134263 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-sb\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.134340 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-nb\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.134395 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/7e362383-f440-41a3-9d31-7378a94aeca6-kube-api-access-kbd4c\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.134482 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-dns-svc\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.134609 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-config\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.236744 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-dns-svc\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.236811 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-config\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.236968 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-sb\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.237029 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-nb\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.237089 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/7e362383-f440-41a3-9d31-7378a94aeca6-kube-api-access-kbd4c\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.237977 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-dns-svc\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.238054 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-config\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.238280 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-nb\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.238520 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-sb\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.278228 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/7e362383-f440-41a3-9d31-7378a94aeca6-kube-api-access-kbd4c\") pod \"dnsmasq-dns-6c78759bc5-lq749\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.345419 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c78759bc5-lq749"
Need to start a new one" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" Jan 29 15:35:16 crc kubenswrapper[4753]: W0129 15:35:16.848893 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e362383_f440_41a3_9d31_7378a94aeca6.slice/crio-5a6a267e94477409f80bdcd8cd930b79db39f2ec093f6461abcf29d7abb8095d WatchSource:0}: Error finding container 5a6a267e94477409f80bdcd8cd930b79db39f2ec093f6461abcf29d7abb8095d: Status 404 returned error can't find the container with id 5a6a267e94477409f80bdcd8cd930b79db39f2ec093f6461abcf29d7abb8095d Jan 29 15:35:16 crc kubenswrapper[4753]: I0129 15:35:16.857360 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c78759bc5-lq749"] Jan 29 15:35:17 crc kubenswrapper[4753]: I0129 15:35:17.815720 4753 generic.go:334] "Generic (PLEG): container finished" podID="7e362383-f440-41a3-9d31-7378a94aeca6" containerID="e01d6e8e0c6c3615bbf394c6709d72cc87a89a261ad5adfad8c1f8a6edce4554" exitCode=0 Jan 29 15:35:17 crc kubenswrapper[4753]: I0129 15:35:17.815837 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" event={"ID":"7e362383-f440-41a3-9d31-7378a94aeca6","Type":"ContainerDied","Data":"e01d6e8e0c6c3615bbf394c6709d72cc87a89a261ad5adfad8c1f8a6edce4554"} Jan 29 15:35:17 crc kubenswrapper[4753]: I0129 15:35:17.816166 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" event={"ID":"7e362383-f440-41a3-9d31-7378a94aeca6","Type":"ContainerStarted","Data":"5a6a267e94477409f80bdcd8cd930b79db39f2ec093f6461abcf29d7abb8095d"} Jan 29 15:35:18 crc kubenswrapper[4753]: I0129 15:35:18.837036 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" event={"ID":"7e362383-f440-41a3-9d31-7378a94aeca6","Type":"ContainerStarted","Data":"95dc5639c8b059145586af0cc3df2bc73ad9f53187f59a5678d9c3db90f50e99"} Jan 29 15:35:18 crc kubenswrapper[4753]: I0129 15:35:18.839043 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" Jan 29 15:35:18 crc kubenswrapper[4753]: I0129 15:35:18.870906 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" podStartSLOduration=3.870885542 podStartE2EDuration="3.870885542s" podCreationTimestamp="2026-01-29 15:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:35:18.863110725 +0000 UTC m=+5553.557845117" watchObservedRunningTime="2026-01-29 15:35:18.870885542 +0000 UTC m=+5553.565619924" Jan 29 15:35:26 crc kubenswrapper[4753]: I0129 15:35:26.347307 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" Jan 29 15:35:26 crc kubenswrapper[4753]: I0129 15:35:26.450200 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77779959c9-5sw5r"] Jan 29 15:35:26 crc kubenswrapper[4753]: I0129 15:35:26.450605 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" podUID="8fc5f142-1f9b-45b9-a0df-0abe9530d688" containerName="dnsmasq-dns" containerID="cri-o://950fdfde8da868542ae3476859a5f714f1afd03535c8e5b01a25b64b5f6fba9b" gracePeriod=10 Jan 29 15:35:26 crc kubenswrapper[4753]: I0129 15:35:26.912220 4753 generic.go:334] "Generic (PLEG): container finished" 
podID="8fc5f142-1f9b-45b9-a0df-0abe9530d688" containerID="950fdfde8da868542ae3476859a5f714f1afd03535c8e5b01a25b64b5f6fba9b" exitCode=0 Jan 29 15:35:26 crc kubenswrapper[4753]: I0129 15:35:26.912528 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" event={"ID":"8fc5f142-1f9b-45b9-a0df-0abe9530d688","Type":"ContainerDied","Data":"950fdfde8da868542ae3476859a5f714f1afd03535c8e5b01a25b64b5f6fba9b"} Jan 29 15:35:26 crc kubenswrapper[4753]: I0129 15:35:26.912559 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" event={"ID":"8fc5f142-1f9b-45b9-a0df-0abe9530d688","Type":"ContainerDied","Data":"72fd5c8b652eb05a43c54852288c24cdde8875a198545469926f968be8c1b0a4"} Jan 29 15:35:26 crc kubenswrapper[4753]: I0129 15:35:26.912574 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72fd5c8b652eb05a43c54852288c24cdde8875a198545469926f968be8c1b0a4" Jan 29 15:35:26 crc kubenswrapper[4753]: I0129 15:35:26.939605 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.005728 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-nb\") pod \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.005818 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-dns-svc\") pod \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.005862 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ztqn\" (UniqueName: \"kubernetes.io/projected/8fc5f142-1f9b-45b9-a0df-0abe9530d688-kube-api-access-4ztqn\") pod \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.005985 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-sb\") pod \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.006011 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-config\") pod \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\" (UID: \"8fc5f142-1f9b-45b9-a0df-0abe9530d688\") " Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.026431 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc5f142-1f9b-45b9-a0df-0abe9530d688-kube-api-access-4ztqn" (OuterVolumeSpecName: "kube-api-access-4ztqn") pod "8fc5f142-1f9b-45b9-a0df-0abe9530d688" (UID: "8fc5f142-1f9b-45b9-a0df-0abe9530d688"). InnerVolumeSpecName "kube-api-access-4ztqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.073003 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8fc5f142-1f9b-45b9-a0df-0abe9530d688" (UID: "8fc5f142-1f9b-45b9-a0df-0abe9530d688"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.077706 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8fc5f142-1f9b-45b9-a0df-0abe9530d688" (UID: "8fc5f142-1f9b-45b9-a0df-0abe9530d688"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.078236 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-config" (OuterVolumeSpecName: "config") pod "8fc5f142-1f9b-45b9-a0df-0abe9530d688" (UID: "8fc5f142-1f9b-45b9-a0df-0abe9530d688"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.098742 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8fc5f142-1f9b-45b9-a0df-0abe9530d688" (UID: "8fc5f142-1f9b-45b9-a0df-0abe9530d688"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.109088 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.109122 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.109134 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ztqn\" (UniqueName: \"kubernetes.io/projected/8fc5f142-1f9b-45b9-a0df-0abe9530d688-kube-api-access-4ztqn\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.109144 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.109155 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fc5f142-1f9b-45b9-a0df-0abe9530d688-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.920701 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77779959c9-5sw5r" Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.960204 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77779959c9-5sw5r"] Jan 29 15:35:27 crc kubenswrapper[4753]: I0129 15:35:27.973196 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77779959c9-5sw5r"] Jan 29 15:35:28 crc kubenswrapper[4753]: I0129 15:35:28.161772 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc5f142-1f9b-45b9-a0df-0abe9530d688" path="/var/lib/kubelet/pods/8fc5f142-1f9b-45b9-a0df-0abe9530d688/volumes" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.007590 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6zxpq"] Jan 29 15:35:29 crc kubenswrapper[4753]: E0129 15:35:29.008055 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc5f142-1f9b-45b9-a0df-0abe9530d688" containerName="init" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.008070 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc5f142-1f9b-45b9-a0df-0abe9530d688" containerName="init" Jan 29 15:35:29 crc kubenswrapper[4753]: E0129 15:35:29.008087 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc5f142-1f9b-45b9-a0df-0abe9530d688" containerName="dnsmasq-dns" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.008093 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc5f142-1f9b-45b9-a0df-0abe9530d688" containerName="dnsmasq-dns" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.008305 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc5f142-1f9b-45b9-a0df-0abe9530d688" containerName="dnsmasq-dns" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.008938 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.016360 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6zxpq"] Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.100931 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-53dd-account-create-update-zlbhj"] Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.102676 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.104492 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.115934 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-53dd-account-create-update-zlbhj"] Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.147422 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7lf\" (UniqueName: \"kubernetes.io/projected/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-kube-api-access-ts7lf\") pod \"cinder-db-create-6zxpq\" (UID: \"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\") " pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.147468 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-operator-scripts\") pod \"cinder-db-create-6zxpq\" (UID: \"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\") " pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.249495 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7lf\" (UniqueName: \"kubernetes.io/projected/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-kube-api-access-ts7lf\") pod \"cinder-db-create-6zxpq\" (UID: \"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\") " pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.249569 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-operator-scripts\") pod \"cinder-db-create-6zxpq\" (UID: \"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\") " pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.250483 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-operator-scripts\") pod \"cinder-db-create-6zxpq\" (UID: \"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\") " pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.251316 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbr4w\" (UniqueName: \"kubernetes.io/projected/50dc7afb-2680-43db-88e4-8b5315ee34c4-kube-api-access-tbr4w\") pod \"cinder-53dd-account-create-update-zlbhj\" (UID: \"50dc7afb-2680-43db-88e4-8b5315ee34c4\") " pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.252119 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50dc7afb-2680-43db-88e4-8b5315ee34c4-operator-scripts\") pod \"cinder-53dd-account-create-update-zlbhj\" (UID: \"50dc7afb-2680-43db-88e4-8b5315ee34c4\") " pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.276873 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7lf\" (UniqueName: \"kubernetes.io/projected/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-kube-api-access-ts7lf\") pod \"cinder-db-create-6zxpq\" (UID: 
\"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\") " pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.324402 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.353954 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbr4w\" (UniqueName: \"kubernetes.io/projected/50dc7afb-2680-43db-88e4-8b5315ee34c4-kube-api-access-tbr4w\") pod \"cinder-53dd-account-create-update-zlbhj\" (UID: \"50dc7afb-2680-43db-88e4-8b5315ee34c4\") " pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.354053 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50dc7afb-2680-43db-88e4-8b5315ee34c4-operator-scripts\") pod \"cinder-53dd-account-create-update-zlbhj\" (UID: \"50dc7afb-2680-43db-88e4-8b5315ee34c4\") " pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.354957 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50dc7afb-2680-43db-88e4-8b5315ee34c4-operator-scripts\") pod \"cinder-53dd-account-create-update-zlbhj\" (UID: \"50dc7afb-2680-43db-88e4-8b5315ee34c4\") " pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.374782 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbr4w\" (UniqueName: \"kubernetes.io/projected/50dc7afb-2680-43db-88e4-8b5315ee34c4-kube-api-access-tbr4w\") pod \"cinder-53dd-account-create-update-zlbhj\" (UID: \"50dc7afb-2680-43db-88e4-8b5315ee34c4\") " pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.418895 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.867958 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6zxpq"] Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.941305 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-53dd-account-create-update-zlbhj"] Jan 29 15:35:29 crc kubenswrapper[4753]: I0129 15:35:29.950537 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6zxpq" event={"ID":"f3129f07-ee9a-402f-97f1-8a8c3093b3ae","Type":"ContainerStarted","Data":"bf4142cbf07f3f7509adf538c140d42c5c90d3e002f47973e3fbb35fb68156d9"} Jan 29 15:35:30 crc kubenswrapper[4753]: I0129 15:35:30.963375 4753 generic.go:334] "Generic (PLEG): container finished" podID="f3129f07-ee9a-402f-97f1-8a8c3093b3ae" containerID="f3e96303b31f0a4eaf9543ecf4a2fbb463326996b9ef00debfffac3c9003316a" exitCode=0 Jan 29 15:35:30 crc kubenswrapper[4753]: I0129 15:35:30.963475 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6zxpq" event={"ID":"f3129f07-ee9a-402f-97f1-8a8c3093b3ae","Type":"ContainerDied","Data":"f3e96303b31f0a4eaf9543ecf4a2fbb463326996b9ef00debfffac3c9003316a"} Jan 29 15:35:30 crc kubenswrapper[4753]: I0129 15:35:30.966670 4753 generic.go:334] "Generic (PLEG): container finished" podID="50dc7afb-2680-43db-88e4-8b5315ee34c4" containerID="57c97e4fbc6500efa3262734e237de8aa934f2cf7da7e1285ee3f2add032876f" exitCode=0 Jan 29 15:35:30 crc kubenswrapper[4753]: I0129 15:35:30.966708 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-53dd-account-create-update-zlbhj" event={"ID":"50dc7afb-2680-43db-88e4-8b5315ee34c4","Type":"ContainerDied","Data":"57c97e4fbc6500efa3262734e237de8aa934f2cf7da7e1285ee3f2add032876f"} Jan 29 15:35:30 crc kubenswrapper[4753]: I0129 15:35:30.966748 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-53dd-account-create-update-zlbhj" event={"ID":"50dc7afb-2680-43db-88e4-8b5315ee34c4","Type":"ContainerStarted","Data":"2554191b7735c30e52a05d09cdd251f72d5fcc101fa68c28a83ad077f6dbf15a"} Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.417994 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.426623 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.526718 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts7lf\" (UniqueName: \"kubernetes.io/projected/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-kube-api-access-ts7lf\") pod \"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\" (UID: \"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\") " Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.527030 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50dc7afb-2680-43db-88e4-8b5315ee34c4-operator-scripts\") pod \"50dc7afb-2680-43db-88e4-8b5315ee34c4\" (UID: \"50dc7afb-2680-43db-88e4-8b5315ee34c4\") " Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.527188 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbr4w\" (UniqueName: \"kubernetes.io/projected/50dc7afb-2680-43db-88e4-8b5315ee34c4-kube-api-access-tbr4w\") pod \"50dc7afb-2680-43db-88e4-8b5315ee34c4\" (UID: \"50dc7afb-2680-43db-88e4-8b5315ee34c4\") " Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.527217 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-operator-scripts\") pod \"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\" (UID: \"f3129f07-ee9a-402f-97f1-8a8c3093b3ae\") " Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.527575 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50dc7afb-2680-43db-88e4-8b5315ee34c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50dc7afb-2680-43db-88e4-8b5315ee34c4" (UID: "50dc7afb-2680-43db-88e4-8b5315ee34c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.528047 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3129f07-ee9a-402f-97f1-8a8c3093b3ae" (UID: "f3129f07-ee9a-402f-97f1-8a8c3093b3ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.532929 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-kube-api-access-ts7lf" (OuterVolumeSpecName: "kube-api-access-ts7lf") pod "f3129f07-ee9a-402f-97f1-8a8c3093b3ae" (UID: "f3129f07-ee9a-402f-97f1-8a8c3093b3ae"). InnerVolumeSpecName "kube-api-access-ts7lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.534173 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50dc7afb-2680-43db-88e4-8b5315ee34c4-kube-api-access-tbr4w" (OuterVolumeSpecName: "kube-api-access-tbr4w") pod "50dc7afb-2680-43db-88e4-8b5315ee34c4" (UID: "50dc7afb-2680-43db-88e4-8b5315ee34c4"). InnerVolumeSpecName "kube-api-access-tbr4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.630391 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts7lf\" (UniqueName: \"kubernetes.io/projected/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-kube-api-access-ts7lf\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.630460 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50dc7afb-2680-43db-88e4-8b5315ee34c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.630482 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbr4w\" (UniqueName: \"kubernetes.io/projected/50dc7afb-2680-43db-88e4-8b5315ee34c4-kube-api-access-tbr4w\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.630502 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3129f07-ee9a-402f-97f1-8a8c3093b3ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.989353 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6zxpq" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.989375 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6zxpq" event={"ID":"f3129f07-ee9a-402f-97f1-8a8c3093b3ae","Type":"ContainerDied","Data":"bf4142cbf07f3f7509adf538c140d42c5c90d3e002f47973e3fbb35fb68156d9"} Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.989465 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf4142cbf07f3f7509adf538c140d42c5c90d3e002f47973e3fbb35fb68156d9" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.991491 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-53dd-account-create-update-zlbhj" event={"ID":"50dc7afb-2680-43db-88e4-8b5315ee34c4","Type":"ContainerDied","Data":"2554191b7735c30e52a05d09cdd251f72d5fcc101fa68c28a83ad077f6dbf15a"} Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.991542 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2554191b7735c30e52a05d09cdd251f72d5fcc101fa68c28a83ad077f6dbf15a" Jan 29 15:35:32 crc kubenswrapper[4753]: I0129 15:35:32.991571 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-53dd-account-create-update-zlbhj" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.362612 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-cdd6x"] Jan 29 15:35:34 crc kubenswrapper[4753]: E0129 15:35:34.363314 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3129f07-ee9a-402f-97f1-8a8c3093b3ae" containerName="mariadb-database-create" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.363331 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3129f07-ee9a-402f-97f1-8a8c3093b3ae" containerName="mariadb-database-create" Jan 29 15:35:34 crc kubenswrapper[4753]: E0129 15:35:34.363349 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dc7afb-2680-43db-88e4-8b5315ee34c4" containerName="mariadb-account-create-update" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.363357 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dc7afb-2680-43db-88e4-8b5315ee34c4" containerName="mariadb-account-create-update" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.363552 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="50dc7afb-2680-43db-88e4-8b5315ee34c4" containerName="mariadb-account-create-update" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.363566 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3129f07-ee9a-402f-97f1-8a8c3093b3ae" containerName="mariadb-database-create" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.364200 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.366521 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.367125 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.368660 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xlgk6" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.373624 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cdd6x"] Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.462895 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-etc-machine-id\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.462986 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xb49\" (UniqueName: \"kubernetes.io/projected/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-kube-api-access-7xb49\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.463023 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-db-sync-config-data\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc 
kubenswrapper[4753]: I0129 15:35:34.463111 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-config-data\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.463286 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-scripts\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.463332 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-combined-ca-bundle\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.564946 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-db-sync-config-data\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.565080 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-config-data\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.565127 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-scripts\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.565154 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-combined-ca-bundle\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.565216 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-etc-machine-id\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.565282 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xb49\" (UniqueName: \"kubernetes.io/projected/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-kube-api-access-7xb49\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.565630 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-etc-machine-id\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.578309 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-config-data\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.579356 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-scripts\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.580878 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-db-sync-config-data\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.583827 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-combined-ca-bundle\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.594929 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xb49\" (UniqueName: \"kubernetes.io/projected/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-kube-api-access-7xb49\") pod \"cinder-db-sync-cdd6x\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:34 crc kubenswrapper[4753]: I0129 15:35:34.693496 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:35 crc kubenswrapper[4753]: I0129 15:35:35.168132 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cdd6x"] Jan 29 15:35:36 crc kubenswrapper[4753]: I0129 15:35:36.017540 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cdd6x" event={"ID":"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf","Type":"ContainerStarted","Data":"e8f4f1e5862d855a9e7856d9ee77bc0b86f5442662aef00530c6df7b12717aa4"} Jan 29 15:35:36 crc kubenswrapper[4753]: I0129 15:35:36.017949 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cdd6x" event={"ID":"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf","Type":"ContainerStarted","Data":"af37d2f4fc95103e9b61e80f18caa6ec120571b116d0d9077d5ba965cba2dbee"} Jan 29 15:35:36 crc kubenswrapper[4753]: I0129 15:35:36.034408 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-cdd6x" podStartSLOduration=2.034391962 podStartE2EDuration="2.034391962s" podCreationTimestamp="2026-01-29 15:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:35:36.032889342 +0000 UTC m=+5570.727623724" watchObservedRunningTime="2026-01-29 15:35:36.034391962 +0000 UTC m=+5570.729126344" Jan 29 15:35:39 crc kubenswrapper[4753]: I0129 15:35:39.043887 4753 generic.go:334] "Generic (PLEG): container finished" podID="fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" containerID="e8f4f1e5862d855a9e7856d9ee77bc0b86f5442662aef00530c6df7b12717aa4" exitCode=0 Jan 29 15:35:39 crc kubenswrapper[4753]: I0129 15:35:39.044004 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cdd6x" event={"ID":"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf","Type":"ContainerDied","Data":"e8f4f1e5862d855a9e7856d9ee77bc0b86f5442662aef00530c6df7b12717aa4"} Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.484108 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.579581 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-combined-ca-bundle\") pod \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.579831 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-db-sync-config-data\") pod \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.579898 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-config-data\") pod \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.579914 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-scripts\") pod \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.579996 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xb49\" (UniqueName: \"kubernetes.io/projected/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-kube-api-access-7xb49\") pod \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.580052 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-etc-machine-id\") pod \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\" (UID: \"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf\") " Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.580244 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" (UID: "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.580405 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.585479 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-kube-api-access-7xb49" (OuterVolumeSpecName: "kube-api-access-7xb49") pod "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" (UID: "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf"). InnerVolumeSpecName "kube-api-access-7xb49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.585857 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" (UID: "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.589283 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-scripts" (OuterVolumeSpecName: "scripts") pod "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" (UID: "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.604371 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" (UID: "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.623537 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-config-data" (OuterVolumeSpecName: "config-data") pod "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" (UID: "fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.681920 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.681967 4753 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.682019 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.682036 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:40 crc kubenswrapper[4753]: I0129 15:35:40.682050 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xb49\" (UniqueName: \"kubernetes.io/projected/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf-kube-api-access-7xb49\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.068029 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cdd6x" event={"ID":"fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf","Type":"ContainerDied","Data":"af37d2f4fc95103e9b61e80f18caa6ec120571b116d0d9077d5ba965cba2dbee"} Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.068068 4753 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="af37d2f4fc95103e9b61e80f18caa6ec120571b116d0d9077d5ba965cba2dbee" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.068068 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cdd6x" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.383321 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6c4df955-76jcm"] Jan 29 15:35:41 crc kubenswrapper[4753]: E0129 15:35:41.383772 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" containerName="cinder-db-sync" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.383792 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" containerName="cinder-db-sync" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.384044 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" containerName="cinder-db-sync" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.385993 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.413602 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6c4df955-76jcm"] Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.497749 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.497797 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6jnp\" (UniqueName: \"kubernetes.io/projected/87c3e570-7085-4a5f-b38b-7d4d0df86a99-kube-api-access-m6jnp\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.497835 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-config\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.497914 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.497950 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-dns-svc\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.504104 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-api-0"] Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.511409 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.513775 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xlgk6" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.513988 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.514319 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.516280 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.519852 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600147 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600243 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6jnp\" (UniqueName: \"kubernetes.io/projected/87c3e570-7085-4a5f-b38b-7d4d0df86a99-kube-api-access-m6jnp\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600282 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0b86796-88a4-4177-92ce-ea4cbceb4749-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600311 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b86796-88a4-4177-92ce-ea4cbceb4749-logs\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600347 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkvnh\" (UniqueName: \"kubernetes.io/projected/d0b86796-88a4-4177-92ce-ea4cbceb4749-kube-api-access-rkvnh\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600403 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-config\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600510 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data-custom\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600576 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600616 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600684 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-dns-svc\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600732 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.600759 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-scripts\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.601432 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.601435 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-config\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.601733 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-dns-svc\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.601930 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87c3e570-7085-4a5f-b38b-7d4d0df86a99-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 
15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.621368 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6jnp\" (UniqueName: \"kubernetes.io/projected/87c3e570-7085-4a5f-b38b-7d4d0df86a99-kube-api-access-m6jnp\") pod \"dnsmasq-dns-6f6c4df955-76jcm\" (UID: \"87c3e570-7085-4a5f-b38b-7d4d0df86a99\") " pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.702792 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data-custom\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.702897 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.702976 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.703000 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-scripts\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.703067 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0b86796-88a4-4177-92ce-ea4cbceb4749-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.703094 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b86796-88a4-4177-92ce-ea4cbceb4749-logs\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.703125 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkvnh\" (UniqueName: \"kubernetes.io/projected/d0b86796-88a4-4177-92ce-ea4cbceb4749-kube-api-access-rkvnh\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.703217 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0b86796-88a4-4177-92ce-ea4cbceb4749-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.703563 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b86796-88a4-4177-92ce-ea4cbceb4749-logs\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 
15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.706481 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data-custom\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.707504 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.708625 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-scripts\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.709324 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.711645 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.727068 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkvnh\" (UniqueName: \"kubernetes.io/projected/d0b86796-88a4-4177-92ce-ea4cbceb4749-kube-api-access-rkvnh\") pod \"cinder-api-0\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " pod="openstack/cinder-api-0" Jan 29 15:35:41 crc kubenswrapper[4753]: I0129 15:35:41.833302 4753 util.go:30] "No sandbox for pod can be found. 
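
The reconciler entries above show each dnsmasq-dns-6f6c4df955-76jcm and cinder-api-0 volume walking the kubelet's mount pipeline in order: operationExecutor.VerifyControllerAttachedVolume started, then operationExecutor.MountVolume started, then MountVolume.SetUp succeeded. A minimal sketch of tracking that progression, assuming the entries have already been split one per string; the regex is written against the escaped \" quoting exactly as it appears in this dump:

import re

# Pipeline stages, in order, exactly as logged above.
STAGES = [
    "operationExecutor.VerifyControllerAttachedVolume started",
    "operationExecutor.MountVolume started",
    "MountVolume.SetUp succeeded",
]

def mount_progress(entries):
    """Map volume name -> index of the furthest stage reached (0..2).

    Keyed by volume name alone for brevity; a real tool would key by
    (pod UID, volume name), since names such as "config-data" repeat
    across pods in this log.
    """
    progress = {}
    for entry in entries:
        for i, stage in enumerate(STAGES):
            if stage in entry:
                m = re.search(r'for volume \\"([^"\\]+)\\"', entry)
                if m:
                    progress[m.group(1)] = max(progress.get(m.group(1), -1), i)
    return progress

Fed the entries above, every volume of both pods should come back at stage 2, matching the MountVolume.SetUp succeeded lines.
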
Need to start a new one" pod="openstack/cinder-api-0" Jan 29 15:35:42 crc kubenswrapper[4753]: I0129 15:35:42.261612 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6c4df955-76jcm"] Jan 29 15:35:42 crc kubenswrapper[4753]: W0129 15:35:42.399487 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0b86796_88a4_4177_92ce_ea4cbceb4749.slice/crio-76160e8d7dcdfea6edddc64b78fda41581a602189c60bf5f79dd166501d5de94 WatchSource:0}: Error finding container 76160e8d7dcdfea6edddc64b78fda41581a602189c60bf5f79dd166501d5de94: Status 404 returned error can't find the container with id 76160e8d7dcdfea6edddc64b78fda41581a602189c60bf5f79dd166501d5de94 Jan 29 15:35:42 crc kubenswrapper[4753]: I0129 15:35:42.403704 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 15:35:43 crc kubenswrapper[4753]: I0129 15:35:43.084903 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d0b86796-88a4-4177-92ce-ea4cbceb4749","Type":"ContainerStarted","Data":"f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90"} Jan 29 15:35:43 crc kubenswrapper[4753]: I0129 15:35:43.084951 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d0b86796-88a4-4177-92ce-ea4cbceb4749","Type":"ContainerStarted","Data":"76160e8d7dcdfea6edddc64b78fda41581a602189c60bf5f79dd166501d5de94"} Jan 29 15:35:43 crc kubenswrapper[4753]: I0129 15:35:43.086342 4753 generic.go:334] "Generic (PLEG): container finished" podID="87c3e570-7085-4a5f-b38b-7d4d0df86a99" containerID="c53e49b7feac94269cc96cca26a3740896aa88ca33f6810b804f781f138b5455" exitCode=0 Jan 29 15:35:43 crc kubenswrapper[4753]: I0129 15:35:43.086369 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" event={"ID":"87c3e570-7085-4a5f-b38b-7d4d0df86a99","Type":"ContainerDied","Data":"c53e49b7feac94269cc96cca26a3740896aa88ca33f6810b804f781f138b5455"} Jan 29 15:35:43 crc kubenswrapper[4753]: I0129 15:35:43.086402 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" event={"ID":"87c3e570-7085-4a5f-b38b-7d4d0df86a99","Type":"ContainerStarted","Data":"6880c4d0cb5a615e9d39ba01c92c2fe001d83954e8a5839e1cdf976e49ce36c5"} Jan 29 15:35:44 crc kubenswrapper[4753]: I0129 15:35:44.095407 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d0b86796-88a4-4177-92ce-ea4cbceb4749","Type":"ContainerStarted","Data":"9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf"} Jan 29 15:35:44 crc kubenswrapper[4753]: I0129 15:35:44.096017 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 29 15:35:44 crc kubenswrapper[4753]: I0129 15:35:44.096968 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" event={"ID":"87c3e570-7085-4a5f-b38b-7d4d0df86a99","Type":"ContainerStarted","Data":"5305310c011a2b5f4274c35f999620d2cf129ae5d02a58f4338944e9e97b156b"} Jan 29 15:35:44 crc kubenswrapper[4753]: I0129 15:35:44.097200 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:44 crc kubenswrapper[4753]: I0129 15:35:44.112620 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.112601067 
podStartE2EDuration="3.112601067s" podCreationTimestamp="2026-01-29 15:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:35:44.111324662 +0000 UTC m=+5578.806059064" watchObservedRunningTime="2026-01-29 15:35:44.112601067 +0000 UTC m=+5578.807335449" Jan 29 15:35:44 crc kubenswrapper[4753]: I0129 15:35:44.137744 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" podStartSLOduration=3.13772523 podStartE2EDuration="3.13772523s" podCreationTimestamp="2026-01-29 15:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:35:44.128305848 +0000 UTC m=+5578.823040230" watchObservedRunningTime="2026-01-29 15:35:44.13772523 +0000 UTC m=+5578.832459612" Jan 29 15:35:51 crc kubenswrapper[4753]: I0129 15:35:51.713311 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6c4df955-76jcm" Jan 29 15:35:51 crc kubenswrapper[4753]: I0129 15:35:51.780338 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c78759bc5-lq749"] Jan 29 15:35:51 crc kubenswrapper[4753]: I0129 15:35:51.780621 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" podUID="7e362383-f440-41a3-9d31-7378a94aeca6" containerName="dnsmasq-dns" containerID="cri-o://95dc5639c8b059145586af0cc3df2bc73ad9f53187f59a5678d9c3db90f50e99" gracePeriod=10 Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.201193 4753 generic.go:334] "Generic (PLEG): container finished" podID="7e362383-f440-41a3-9d31-7378a94aeca6" containerID="95dc5639c8b059145586af0cc3df2bc73ad9f53187f59a5678d9c3db90f50e99" exitCode=0 Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.201602 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" event={"ID":"7e362383-f440-41a3-9d31-7378a94aeca6","Type":"ContainerDied","Data":"95dc5639c8b059145586af0cc3df2bc73ad9f53187f59a5678d9c3db90f50e99"} Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.360898 4753 util.go:48] "No ready sandbox for pod can be found. 
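
The pod_startup_latency_tracker entries above put both cinder-api-0 and dnsmasq-dns-6f6c4df955-76jcm at roughly 3.1 s from creation to observed running. Since neither pod pulled an image (firstStartedPulling and lastFinishedPulling are the zero timestamp 0001-01-01), podStartE2EDuration is simply watchObservedRunningTime minus podCreationTimestamp: 15:35:44.112601067 - 15:35:41 = 3.112601067 s for cinder-api-0. A hedged parsing sketch, with the sample line abbreviated from the entry above:

import re

SAMPLE = ('pod="openstack/cinder-api-0" podStartSLOduration=3.112601067 '
          'podStartE2EDuration="3.112601067s" '
          'podCreationTimestamp="2026-01-29 15:35:41 +0000 UTC"')

def startup_latency(line):
    """Extract the pod name and startup durations from one tracker entry."""
    pod = re.search(r'pod="([^"]+)"', line).group(1)
    slo = float(re.search(r'podStartSLOduration=([\d.]+)', line).group(1))
    e2e = re.search(r'podStartE2EDuration="([^"]+)"', line).group(1)
    return {"pod": pod, "slo_seconds": slo, "e2e": e2e}

print(startup_latency(SAMPLE))
# {'pod': 'openstack/cinder-api-0', 'slo_seconds': 3.112601067, 'e2e': '3.112601067s'}
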
Need to start a new one" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.448587 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-config\") pod \"7e362383-f440-41a3-9d31-7378a94aeca6\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.448769 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-nb\") pod \"7e362383-f440-41a3-9d31-7378a94aeca6\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.448843 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-dns-svc\") pod \"7e362383-f440-41a3-9d31-7378a94aeca6\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.448936 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/7e362383-f440-41a3-9d31-7378a94aeca6-kube-api-access-kbd4c\") pod \"7e362383-f440-41a3-9d31-7378a94aeca6\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.449000 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-sb\") pod \"7e362383-f440-41a3-9d31-7378a94aeca6\" (UID: \"7e362383-f440-41a3-9d31-7378a94aeca6\") " Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.474662 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e362383-f440-41a3-9d31-7378a94aeca6-kube-api-access-kbd4c" (OuterVolumeSpecName: "kube-api-access-kbd4c") pod "7e362383-f440-41a3-9d31-7378a94aeca6" (UID: "7e362383-f440-41a3-9d31-7378a94aeca6"). InnerVolumeSpecName "kube-api-access-kbd4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.500697 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e362383-f440-41a3-9d31-7378a94aeca6" (UID: "7e362383-f440-41a3-9d31-7378a94aeca6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.506727 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e362383-f440-41a3-9d31-7378a94aeca6" (UID: "7e362383-f440-41a3-9d31-7378a94aeca6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.509842 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e362383-f440-41a3-9d31-7378a94aeca6" (UID: "7e362383-f440-41a3-9d31-7378a94aeca6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.511524 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-config" (OuterVolumeSpecName: "config") pod "7e362383-f440-41a3-9d31-7378a94aeca6" (UID: "7e362383-f440-41a3-9d31-7378a94aeca6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.550676 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/7e362383-f440-41a3-9d31-7378a94aeca6-kube-api-access-kbd4c\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.550721 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.550734 4753 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-config\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.550747 4753 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:52 crc kubenswrapper[4753]: I0129 15:35:52.550759 4753 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e362383-f440-41a3-9d31-7378a94aeca6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.032047 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.032369 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-log" containerID="cri-o://6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7" gracePeriod=30 Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.032433 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-metadata" containerID="cri-o://d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5" gracePeriod=30 Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.047710 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.047927 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dcad9510-6729-4c00-8857-b386e6f05806" containerName="nova-scheduler-scheduler" containerID="cri-o://0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" gracePeriod=30 Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.068753 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.069049 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" 
containerName="nova-api-log" containerID="cri-o://f796d177b20c89bfe938e46351a6fd8893f34714c53ff5ac2b153681d17ac15b" gracePeriod=30 Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.069436 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-api" containerID="cri-o://7c9b4c32c91b8c6d6bdeebd5167d5892a44be83f419750e6c28653d5708adae6" gracePeriod=30 Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.084102 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.084555 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="037a17d1-e25f-461b-be74-d4127a64ed11" containerName="nova-cell0-conductor-conductor" containerID="cri-o://56f98792b126883286141647cec180dbbb79a6818635795f8a311bcb1e51c95e" gracePeriod=30 Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.097971 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.098803 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6fe820fd-d9f1-409c-a860-f89f10c60c12" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6" gracePeriod=30 Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.256082 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.256210 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c78759bc5-lq749" event={"ID":"7e362383-f440-41a3-9d31-7378a94aeca6","Type":"ContainerDied","Data":"5a6a267e94477409f80bdcd8cd930b79db39f2ec093f6461abcf29d7abb8095d"} Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.256272 4753 scope.go:117] "RemoveContainer" containerID="95dc5639c8b059145586af0cc3df2bc73ad9f53187f59a5678d9c3db90f50e99" Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.270722 4753 generic.go:334] "Generic (PLEG): container finished" podID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerID="6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7" exitCode=143 Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.270844 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bd0fb-4ccf-4276-8443-73b9b8a7f99f","Type":"ContainerDied","Data":"6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7"} Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.276345 4753 generic.go:334] "Generic (PLEG): container finished" podID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerID="f796d177b20c89bfe938e46351a6fd8893f34714c53ff5ac2b153681d17ac15b" exitCode=143 Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.276390 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4c21c48-52e6-4603-97e0-c33fc2ab2896","Type":"ContainerDied","Data":"f796d177b20c89bfe938e46351a6fd8893f34714c53ff5ac2b153681d17ac15b"} Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.293171 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c78759bc5-lq749"] Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.304359 
4753 scope.go:117] "RemoveContainer" containerID="e01d6e8e0c6c3615bbf394c6709d72cc87a89a261ad5adfad8c1f8a6edce4554" Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.306884 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c78759bc5-lq749"] Jan 29 15:35:53 crc kubenswrapper[4753]: I0129 15:35:53.967522 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.027175 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.160564 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e362383-f440-41a3-9d31-7378a94aeca6" path="/var/lib/kubelet/pods/7e362383-f440-41a3-9d31-7378a94aeca6/volumes" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.210581 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-config-data\") pod \"6fe820fd-d9f1-409c-a860-f89f10c60c12\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.211247 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4lhw\" (UniqueName: \"kubernetes.io/projected/6fe820fd-d9f1-409c-a860-f89f10c60c12-kube-api-access-q4lhw\") pod \"6fe820fd-d9f1-409c-a860-f89f10c60c12\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.211407 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-combined-ca-bundle\") pod \"6fe820fd-d9f1-409c-a860-f89f10c60c12\" (UID: \"6fe820fd-d9f1-409c-a860-f89f10c60c12\") " Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.235113 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe820fd-d9f1-409c-a860-f89f10c60c12-kube-api-access-q4lhw" (OuterVolumeSpecName: "kube-api-access-q4lhw") pod "6fe820fd-d9f1-409c-a860-f89f10c60c12" (UID: "6fe820fd-d9f1-409c-a860-f89f10c60c12"). InnerVolumeSpecName "kube-api-access-q4lhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.242300 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-config-data" (OuterVolumeSpecName: "config-data") pod "6fe820fd-d9f1-409c-a860-f89f10c60c12" (UID: "6fe820fd-d9f1-409c-a860-f89f10c60c12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.257885 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fe820fd-d9f1-409c-a860-f89f10c60c12" (UID: "6fe820fd-d9f1-409c-a860-f89f10c60c12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.287338 4753 generic.go:334] "Generic (PLEG): container finished" podID="6fe820fd-d9f1-409c-a860-f89f10c60c12" containerID="bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6" exitCode=0 Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.287404 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6fe820fd-d9f1-409c-a860-f89f10c60c12","Type":"ContainerDied","Data":"bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6"} Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.287430 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6fe820fd-d9f1-409c-a860-f89f10c60c12","Type":"ContainerDied","Data":"873d143c83f8f531a6b0d96ef80668fa2bd5253056c3200f12ded05cb147f89e"} Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.287446 4753 scope.go:117] "RemoveContainer" containerID="bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.287562 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.313932 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.313979 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4lhw\" (UniqueName: \"kubernetes.io/projected/6fe820fd-d9f1-409c-a860-f89f10c60c12-kube-api-access-q4lhw\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.313993 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe820fd-d9f1-409c-a860-f89f10c60c12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.357397 4753 scope.go:117] "RemoveContainer" containerID="bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6" Jan 29 15:35:54 crc kubenswrapper[4753]: E0129 15:35:54.357968 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6\": container with ID starting with bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6 not found: ID does not exist" containerID="bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.358027 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6"} err="failed to get container status \"bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6\": rpc error: code = NotFound desc = could not find container \"bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6\": container with ID starting with bae7a40191753c6f26076b5709bfb6f0d5038bac2c27d7dc56c18b314ea37cf6 not found: ID does not exist" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.364583 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 
15:35:54.384557 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.393898 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 15:35:54 crc kubenswrapper[4753]: E0129 15:35:54.394401 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e362383-f440-41a3-9d31-7378a94aeca6" containerName="dnsmasq-dns" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.394428 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e362383-f440-41a3-9d31-7378a94aeca6" containerName="dnsmasq-dns" Jan 29 15:35:54 crc kubenswrapper[4753]: E0129 15:35:54.394448 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe820fd-d9f1-409c-a860-f89f10c60c12" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.394454 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe820fd-d9f1-409c-a860-f89f10c60c12" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 15:35:54 crc kubenswrapper[4753]: E0129 15:35:54.394461 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e362383-f440-41a3-9d31-7378a94aeca6" containerName="init" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.394468 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e362383-f440-41a3-9d31-7378a94aeca6" containerName="init" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.394633 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e362383-f440-41a3-9d31-7378a94aeca6" containerName="dnsmasq-dns" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.394650 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe820fd-d9f1-409c-a860-f89f10c60c12" containerName="nova-cell1-novncproxy-novncproxy" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.395413 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.404314 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.409488 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.517067 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppcqx\" (UniqueName: \"kubernetes.io/projected/968ff4a1-242c-4e6c-af17-20956a2c99c4-kube-api-access-ppcqx\") pod \"nova-cell1-novncproxy-0\" (UID: \"968ff4a1-242c-4e6c-af17-20956a2c99c4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.517189 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968ff4a1-242c-4e6c-af17-20956a2c99c4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"968ff4a1-242c-4e6c-af17-20956a2c99c4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.517296 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968ff4a1-242c-4e6c-af17-20956a2c99c4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"968ff4a1-242c-4e6c-af17-20956a2c99c4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.620060 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppcqx\" (UniqueName: \"kubernetes.io/projected/968ff4a1-242c-4e6c-af17-20956a2c99c4-kube-api-access-ppcqx\") pod \"nova-cell1-novncproxy-0\" (UID: \"968ff4a1-242c-4e6c-af17-20956a2c99c4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.620142 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968ff4a1-242c-4e6c-af17-20956a2c99c4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"968ff4a1-242c-4e6c-af17-20956a2c99c4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.620221 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968ff4a1-242c-4e6c-af17-20956a2c99c4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"968ff4a1-242c-4e6c-af17-20956a2c99c4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.624584 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968ff4a1-242c-4e6c-af17-20956a2c99c4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"968ff4a1-242c-4e6c-af17-20956a2c99c4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.624924 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968ff4a1-242c-4e6c-af17-20956a2c99c4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"968ff4a1-242c-4e6c-af17-20956a2c99c4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.641586 
4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppcqx\" (UniqueName: \"kubernetes.io/projected/968ff4a1-242c-4e6c-af17-20956a2c99c4-kube-api-access-ppcqx\") pod \"nova-cell1-novncproxy-0\" (UID: \"968ff4a1-242c-4e6c-af17-20956a2c99c4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:54 crc kubenswrapper[4753]: I0129 15:35:54.723583 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:35:55 crc kubenswrapper[4753]: E0129 15:35:55.030248 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 15:35:55 crc kubenswrapper[4753]: E0129 15:35:55.031906 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 15:35:55 crc kubenswrapper[4753]: E0129 15:35:55.033211 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 15:35:55 crc kubenswrapper[4753]: E0129 15:35:55.033245 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dcad9510-6729-4c00-8857-b386e6f05806" containerName="nova-scheduler-scheduler" Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.304019 4753 generic.go:334] "Generic (PLEG): container finished" podID="037a17d1-e25f-461b-be74-d4127a64ed11" containerID="56f98792b126883286141647cec180dbbb79a6818635795f8a311bcb1e51c95e" exitCode=0 Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.304068 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"037a17d1-e25f-461b-be74-d4127a64ed11","Type":"ContainerDied","Data":"56f98792b126883286141647cec180dbbb79a6818635795f8a311bcb1e51c95e"} Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.379376 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.507265 4753 util.go:48] "No ready sandbox for pod can be found. 
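
The three identical ExecSync failures above are the nova-scheduler-scheduler readiness probe (/usr/bin/pgrep -r DRST nova-scheduler) racing the container's own shutdown: CRI-O refuses to register a new exec PID while the container is stopping, the kubelet appears to retry the exec, and the prober then records "Probe errored" instead of an ordinary probe failure. A sketch of that retry-then-surface shape; the attempt count of 3 is inferred from the three log entries, not taken from kubelet source:

def probe_exec(run_exec, attempts=3):
    """Retry an exec-based probe; surface the last error if all attempts fail."""
    last_err = None
    for _ in range(attempts):
        try:
            return run_exec()              # probe command stdout on success
        except RuntimeError as err:        # e.g. "container is stopping"
            last_err = err
    raise RuntimeError(f"Probe errored: {last_err}")

def exec_in_stopping_container():
    raise RuntimeError("cannot register an exec PID: container is stopping")

try:
    probe_exec(exec_in_stopping_container)
except RuntimeError as err:
    print(err)   # mirrors the E0129 ... "Probe errored" entry above
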
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.645051 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4gj8\" (UniqueName: \"kubernetes.io/projected/037a17d1-e25f-461b-be74-d4127a64ed11-kube-api-access-t4gj8\") pod \"037a17d1-e25f-461b-be74-d4127a64ed11\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.645172 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-combined-ca-bundle\") pod \"037a17d1-e25f-461b-be74-d4127a64ed11\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.645437 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-config-data\") pod \"037a17d1-e25f-461b-be74-d4127a64ed11\" (UID: \"037a17d1-e25f-461b-be74-d4127a64ed11\") " Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.653459 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037a17d1-e25f-461b-be74-d4127a64ed11-kube-api-access-t4gj8" (OuterVolumeSpecName: "kube-api-access-t4gj8") pod "037a17d1-e25f-461b-be74-d4127a64ed11" (UID: "037a17d1-e25f-461b-be74-d4127a64ed11"). InnerVolumeSpecName "kube-api-access-t4gj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.677948 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "037a17d1-e25f-461b-be74-d4127a64ed11" (UID: "037a17d1-e25f-461b-be74-d4127a64ed11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.688717 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-config-data" (OuterVolumeSpecName: "config-data") pod "037a17d1-e25f-461b-be74-d4127a64ed11" (UID: "037a17d1-e25f-461b-be74-d4127a64ed11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.747766 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.747804 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/037a17d1-e25f-461b-be74-d4127a64ed11-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:55 crc kubenswrapper[4753]: I0129 15:35:55.747815 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4gj8\" (UniqueName: \"kubernetes.io/projected/037a17d1-e25f-461b-be74-d4127a64ed11-kube-api-access-t4gj8\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.165241 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe820fd-d9f1-409c-a860-f89f10c60c12" path="/var/lib/kubelet/pods/6fe820fd-d9f1-409c-a860-f89f10c60c12/volumes" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.233103 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": read tcp 10.217.0.2:52754->10.217.1.82:8774: read: connection reset by peer" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.233103 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": read tcp 10.217.0.2:52740->10.217.1.82:8774: read: connection reset by peer" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.306083 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.306307 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="c40be523-36a9-4bd7-8568-94e2921da7bd" containerName="nova-cell1-conductor-conductor" containerID="cri-o://7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae" gracePeriod=30 Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.325258 4753 generic.go:334] "Generic (PLEG): container finished" podID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerID="7c9b4c32c91b8c6d6bdeebd5167d5892a44be83f419750e6c28653d5708adae6" exitCode=0 Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.325334 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4c21c48-52e6-4603-97e0-c33fc2ab2896","Type":"ContainerDied","Data":"7c9b4c32c91b8c6d6bdeebd5167d5892a44be83f419750e6c28653d5708adae6"} Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.338523 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.338521 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"037a17d1-e25f-461b-be74-d4127a64ed11","Type":"ContainerDied","Data":"4b93a42fd46e70cc7196f10a84459d5caf0a37f4c92a07788e98008014b23e62"} Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.338724 4753 scope.go:117] "RemoveContainer" containerID="56f98792b126883286141647cec180dbbb79a6818635795f8a311bcb1e51c95e" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.346995 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"968ff4a1-242c-4e6c-af17-20956a2c99c4","Type":"ContainerStarted","Data":"49abfa5e840fddb434f4781d55db2ed0088a73ef8439c814376a1729fc91e019"} Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.347039 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"968ff4a1-242c-4e6c-af17-20956a2c99c4","Type":"ContainerStarted","Data":"d444f9ab207f2d9c6a19e844b07f5811dc5731f14738ec9305ed3896183bd416"} Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.375984 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.390565 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.406523 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 15:35:56 crc kubenswrapper[4753]: E0129 15:35:56.407106 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="037a17d1-e25f-461b-be74-d4127a64ed11" containerName="nova-cell0-conductor-conductor" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.407138 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="037a17d1-e25f-461b-be74-d4127a64ed11" containerName="nova-cell0-conductor-conductor" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.407431 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="037a17d1-e25f-461b-be74-d4127a64ed11" containerName="nova-cell0-conductor-conductor" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.407983 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.407961981 podStartE2EDuration="2.407961981s" podCreationTimestamp="2026-01-29 15:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:35:56.390530824 +0000 UTC m=+5591.085265226" watchObservedRunningTime="2026-01-29 15:35:56.407961981 +0000 UTC m=+5591.102696373" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.408392 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.413423 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.451244 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": read tcp 10.217.0.2:49052->10.217.1.81:8775: read: connection reset by peer" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.451330 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": read tcp 10.217.0.2:49062->10.217.1.81:8775: read: connection reset by peer" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.455201 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.579379 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33161748-700b-402e-b133-35b428c5887d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"33161748-700b-402e-b133-35b428c5887d\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.579733 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33161748-700b-402e-b133-35b428c5887d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"33161748-700b-402e-b133-35b428c5887d\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.579971 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fz96\" (UniqueName: \"kubernetes.io/projected/33161748-700b-402e-b133-35b428c5887d-kube-api-access-2fz96\") pod \"nova-cell0-conductor-0\" (UID: \"33161748-700b-402e-b133-35b428c5887d\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.681286 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33161748-700b-402e-b133-35b428c5887d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"33161748-700b-402e-b133-35b428c5887d\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.681580 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fz96\" (UniqueName: \"kubernetes.io/projected/33161748-700b-402e-b133-35b428c5887d-kube-api-access-2fz96\") pod \"nova-cell0-conductor-0\" (UID: \"33161748-700b-402e-b133-35b428c5887d\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.681675 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33161748-700b-402e-b133-35b428c5887d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"33161748-700b-402e-b133-35b428c5887d\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.689168 
4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33161748-700b-402e-b133-35b428c5887d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"33161748-700b-402e-b133-35b428c5887d\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.698826 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33161748-700b-402e-b133-35b428c5887d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"33161748-700b-402e-b133-35b428c5887d\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.702849 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fz96\" (UniqueName: \"kubernetes.io/projected/33161748-700b-402e-b133-35b428c5887d-kube-api-access-2fz96\") pod \"nova-cell0-conductor-0\" (UID: \"33161748-700b-402e-b133-35b428c5887d\") " pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.743605 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.789680 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.884976 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-combined-ca-bundle\") pod \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.885356 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c21c48-52e6-4603-97e0-c33fc2ab2896-logs\") pod \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.885444 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-config-data\") pod \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.885628 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxfv6\" (UniqueName: \"kubernetes.io/projected/d4c21c48-52e6-4603-97e0-c33fc2ab2896-kube-api-access-sxfv6\") pod \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\" (UID: \"d4c21c48-52e6-4603-97e0-c33fc2ab2896\") " Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.893415 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4c21c48-52e6-4603-97e0-c33fc2ab2896-logs" (OuterVolumeSpecName: "logs") pod "d4c21c48-52e6-4603-97e0-c33fc2ab2896" (UID: "d4c21c48-52e6-4603-97e0-c33fc2ab2896"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.894219 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4c21c48-52e6-4603-97e0-c33fc2ab2896-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.896426 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4c21c48-52e6-4603-97e0-c33fc2ab2896-kube-api-access-sxfv6" (OuterVolumeSpecName: "kube-api-access-sxfv6") pod "d4c21c48-52e6-4603-97e0-c33fc2ab2896" (UID: "d4c21c48-52e6-4603-97e0-c33fc2ab2896"). InnerVolumeSpecName "kube-api-access-sxfv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.933534 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-config-data" (OuterVolumeSpecName: "config-data") pod "d4c21c48-52e6-4603-97e0-c33fc2ab2896" (UID: "d4c21c48-52e6-4603-97e0-c33fc2ab2896"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.933681 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4c21c48-52e6-4603-97e0-c33fc2ab2896" (UID: "d4c21c48-52e6-4603-97e0-c33fc2ab2896"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.995946 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.995978 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4c21c48-52e6-4603-97e0-c33fc2ab2896-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:56 crc kubenswrapper[4753]: I0129 15:35:56.995989 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxfv6\" (UniqueName: \"kubernetes.io/projected/d4c21c48-52e6-4603-97e0-c33fc2ab2896-kube-api-access-sxfv6\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.138893 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.302911 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-combined-ca-bundle\") pod \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.304876 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-logs\") pod \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.304934 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-config-data\") pod \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.305132 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j4zg\" (UniqueName: \"kubernetes.io/projected/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-kube-api-access-9j4zg\") pod \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\" (UID: \"162bd0fb-4ccf-4276-8443-73b9b8a7f99f\") " Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.306305 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-logs" (OuterVolumeSpecName: "logs") pod "162bd0fb-4ccf-4276-8443-73b9b8a7f99f" (UID: "162bd0fb-4ccf-4276-8443-73b9b8a7f99f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.322056 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-kube-api-access-9j4zg" (OuterVolumeSpecName: "kube-api-access-9j4zg") pod "162bd0fb-4ccf-4276-8443-73b9b8a7f99f" (UID: "162bd0fb-4ccf-4276-8443-73b9b8a7f99f"). InnerVolumeSpecName "kube-api-access-9j4zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.352992 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "162bd0fb-4ccf-4276-8443-73b9b8a7f99f" (UID: "162bd0fb-4ccf-4276-8443-73b9b8a7f99f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.353770 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-config-data" (OuterVolumeSpecName: "config-data") pod "162bd0fb-4ccf-4276-8443-73b9b8a7f99f" (UID: "162bd0fb-4ccf-4276-8443-73b9b8a7f99f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.372136 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d4c21c48-52e6-4603-97e0-c33fc2ab2896","Type":"ContainerDied","Data":"2ad220de0eaaab558a38cafeaf525e2d98b083a3b37926ee23c53332eb534748"} Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.372213 4753 scope.go:117] "RemoveContainer" containerID="7c9b4c32c91b8c6d6bdeebd5167d5892a44be83f419750e6c28653d5708adae6" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.372360 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.378801 4753 generic.go:334] "Generic (PLEG): container finished" podID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerID="d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5" exitCode=0 Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.380023 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.383146 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bd0fb-4ccf-4276-8443-73b9b8a7f99f","Type":"ContainerDied","Data":"d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5"} Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.383363 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"162bd0fb-4ccf-4276-8443-73b9b8a7f99f","Type":"ContainerDied","Data":"e35cdb08c176052c7dab099e81f303f95794c53c57aade871e8fb35d91f54d7a"} Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.383466 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.416996 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.420439 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.420925 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.420990 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j4zg\" (UniqueName: \"kubernetes.io/projected/162bd0fb-4ccf-4276-8443-73b9b8a7f99f-kube-api-access-9j4zg\") on node \"crc\" DevicePath \"\"" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.427358 4753 scope.go:117] "RemoveContainer" containerID="f796d177b20c89bfe938e46351a6fd8893f34714c53ff5ac2b153681d17ac15b" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.450062 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.469620 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.490263 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"] Jan 29 15:35:57 crc kubenswrapper[4753]: E0129 15:35:57.490826 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-metadata" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.490849 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-metadata" Jan 29 15:35:57 crc kubenswrapper[4753]: E0129 15:35:57.490865 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-api" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.490871 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-api" Jan 29 15:35:57 crc kubenswrapper[4753]: E0129 15:35:57.490883 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-log" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.490891 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-log" Jan 29 15:35:57 crc kubenswrapper[4753]: E0129 15:35:57.490970 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-log" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.491001 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-log" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.491311 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-metadata" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.491327 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-api" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.491341 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" containerName="nova-metadata-log" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.491350 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" containerName="nova-api-log" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.492529 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.495830 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.507119 4753 scope.go:117] "RemoveContainer" containerID="d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.529419 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.538443 4753 scope.go:117] "RemoveContainer" containerID="6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.548315 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.558786 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.565867 4753 scope.go:117] "RemoveContainer" containerID="d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5" Jan 29 15:35:57 crc kubenswrapper[4753]: E0129 15:35:57.566388 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5\": container with ID starting with d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5 not found: ID does not exist" containerID="d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.566433 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5"} err="failed to get container status \"d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5\": rpc error: code = NotFound desc = could not find container \"d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5\": container with ID starting with d17e12521146b6f60a6f78a12cdf660ffac6233421f3455856308fdf51f02ac5 not found: ID does not exist" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.566463 4753 scope.go:117] "RemoveContainer" containerID="6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7" Jan 29 15:35:57 crc kubenswrapper[4753]: E0129 15:35:57.566733 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7\": container with ID starting with 6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7 not found: ID does not exist" containerID="6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.566769 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7"} err="failed to get container status \"6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7\": rpc error: code = NotFound desc = could not find container \"6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7\": container with ID starting with 6fd811f1d936dec9bc1d7dc82407d397af48577b67787add1a3a235fb2ff12d7 not found: ID does not exist" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 
15:35:57.570013 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.571794 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.574804 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.588726 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.624327 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493fa42b-a667-4c4f-8473-2848bb63714b-config-data\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.624443 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8d9h\" (UniqueName: \"kubernetes.io/projected/493fa42b-a667-4c4f-8473-2848bb63714b-kube-api-access-t8d9h\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.624891 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493fa42b-a667-4c4f-8473-2848bb63714b-logs\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.624970 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493fa42b-a667-4c4f-8473-2848bb63714b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.728557 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fcb37d-8d12-4311-859b-6aeb6e200031-config-data\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.728786 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493fa42b-a667-4c4f-8473-2848bb63714b-logs\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.728843 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493fa42b-a667-4c4f-8473-2848bb63714b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.728939 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80fcb37d-8d12-4311-859b-6aeb6e200031-logs\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 
15:35:57.729005 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493fa42b-a667-4c4f-8473-2848bb63714b-config-data\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.729038 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwftq\" (UniqueName: \"kubernetes.io/projected/80fcb37d-8d12-4311-859b-6aeb6e200031-kube-api-access-zwftq\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.729091 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fcb37d-8d12-4311-859b-6aeb6e200031-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.729349 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/493fa42b-a667-4c4f-8473-2848bb63714b-logs\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.729404 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8d9h\" (UniqueName: \"kubernetes.io/projected/493fa42b-a667-4c4f-8473-2848bb63714b-kube-api-access-t8d9h\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.732509 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493fa42b-a667-4c4f-8473-2848bb63714b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.733006 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/493fa42b-a667-4c4f-8473-2848bb63714b-config-data\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.748520 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8d9h\" (UniqueName: \"kubernetes.io/projected/493fa42b-a667-4c4f-8473-2848bb63714b-kube-api-access-t8d9h\") pod \"nova-api-0\" (UID: \"493fa42b-a667-4c4f-8473-2848bb63714b\") " pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.822528 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.830943 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80fcb37d-8d12-4311-859b-6aeb6e200031-logs\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.831056 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwftq\" (UniqueName: \"kubernetes.io/projected/80fcb37d-8d12-4311-859b-6aeb6e200031-kube-api-access-zwftq\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.831088 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fcb37d-8d12-4311-859b-6aeb6e200031-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.831223 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fcb37d-8d12-4311-859b-6aeb6e200031-config-data\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.831489 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80fcb37d-8d12-4311-859b-6aeb6e200031-logs\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.834883 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80fcb37d-8d12-4311-859b-6aeb6e200031-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.844085 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80fcb37d-8d12-4311-859b-6aeb6e200031-config-data\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.851237 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwftq\" (UniqueName: \"kubernetes.io/projected/80fcb37d-8d12-4311-859b-6aeb6e200031-kube-api-access-zwftq\") pod \"nova-metadata-0\" (UID: \"80fcb37d-8d12-4311-859b-6aeb6e200031\") " pod="openstack/nova-metadata-0" Jan 29 15:35:57 crc kubenswrapper[4753]: I0129 15:35:57.887884 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.166184 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="037a17d1-e25f-461b-be74-d4127a64ed11" path="/var/lib/kubelet/pods/037a17d1-e25f-461b-be74-d4127a64ed11/volumes" Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.167240 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="162bd0fb-4ccf-4276-8443-73b9b8a7f99f" path="/var/lib/kubelet/pods/162bd0fb-4ccf-4276-8443-73b9b8a7f99f/volumes" Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.167990 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4c21c48-52e6-4603-97e0-c33fc2ab2896" path="/var/lib/kubelet/pods/d4c21c48-52e6-4603-97e0-c33fc2ab2896/volumes" Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.328564 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.396913 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"33161748-700b-402e-b133-35b428c5887d","Type":"ContainerStarted","Data":"5ae2236198396af81271466693c9c2deaed6eca73d452aed019d75a6e23fb502"} Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.397418 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"33161748-700b-402e-b133-35b428c5887d","Type":"ContainerStarted","Data":"3a2a26344d120b0e59fd3c29731e65727f40fe47349a51911770d2df07eb96b1"} Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.397491 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.398396 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"493fa42b-a667-4c4f-8473-2848bb63714b","Type":"ContainerStarted","Data":"e23e81f0a71e3620dc00ec0845cf41804bbfb94571eebc8d9c3c438615fa84e2"} Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.417528 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.417508332 podStartE2EDuration="2.417508332s" podCreationTimestamp="2026-01-29 15:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:35:58.413098904 +0000 UTC m=+5593.107833296" watchObservedRunningTime="2026-01-29 15:35:58.417508332 +0000 UTC m=+5593.112242714" Jan 29 15:35:58 crc kubenswrapper[4753]: I0129 15:35:58.440607 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 29 15:35:58 crc kubenswrapper[4753]: W0129 15:35:58.443343 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80fcb37d_8d12_4311_859b_6aeb6e200031.slice/crio-cbd5fa849222873421dd348d9b15bb82a657fef50526d3a2793fcae3338426cc WatchSource:0}: Error finding container cbd5fa849222873421dd348d9b15bb82a657fef50526d3a2793fcae3338426cc: Status 404 returned error can't find the container with id cbd5fa849222873421dd348d9b15bb82a657fef50526d3a2793fcae3338426cc Jan 29 15:35:59 crc kubenswrapper[4753]: I0129 15:35:59.408331 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"493fa42b-a667-4c4f-8473-2848bb63714b","Type":"ContainerStarted","Data":"333e138be6253f10e77bcd7823772c364e76491a99d58b1e96aeafd6b40a219f"} Jan 29 15:35:59 crc kubenswrapper[4753]: I0129 15:35:59.408718 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"493fa42b-a667-4c4f-8473-2848bb63714b","Type":"ContainerStarted","Data":"adcdb9a965be43b3bb3567410bcb2e0db488a91e98f29b64a14d3dc194abc588"} Jan 29 15:35:59 crc kubenswrapper[4753]: I0129 15:35:59.410140 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80fcb37d-8d12-4311-859b-6aeb6e200031","Type":"ContainerStarted","Data":"96da31b6e167b077047640ae250f644fc2ceecf4a16a5d2db695c872d4b3f2cb"} Jan 29 15:35:59 crc kubenswrapper[4753]: I0129 15:35:59.410198 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80fcb37d-8d12-4311-859b-6aeb6e200031","Type":"ContainerStarted","Data":"195e9088a6ea1eef67d9c7eaf3504d683d16101e31241f124925263076db12d6"} Jan 29 15:35:59 crc kubenswrapper[4753]: I0129 15:35:59.410214 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80fcb37d-8d12-4311-859b-6aeb6e200031","Type":"ContainerStarted","Data":"cbd5fa849222873421dd348d9b15bb82a657fef50526d3a2793fcae3338426cc"} Jan 29 15:35:59 crc kubenswrapper[4753]: I0129 15:35:59.437203 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.437180462 podStartE2EDuration="2.437180462s" podCreationTimestamp="2026-01-29 15:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:35:59.428450008 +0000 UTC m=+5594.123184390" watchObservedRunningTime="2026-01-29 15:35:59.437180462 +0000 UTC m=+5594.131914864" Jan 29 15:35:59 crc kubenswrapper[4753]: I0129 15:35:59.457089 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.457066835 podStartE2EDuration="2.457066835s" podCreationTimestamp="2026-01-29 15:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:35:59.450384046 +0000 UTC m=+5594.145118438" watchObservedRunningTime="2026-01-29 15:35:59.457066835 +0000 UTC m=+5594.151801217" Jan 29 15:35:59 crc kubenswrapper[4753]: I0129 15:35:59.723869 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:36:00 crc kubenswrapper[4753]: E0129 15:36:00.027082 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 15:36:00 crc kubenswrapper[4753]: E0129 15:36:00.029595 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 15:36:00 crc kubenswrapper[4753]: E0129 15:36:00.030991 4753 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 29 15:36:00 crc kubenswrapper[4753]: E0129 15:36:00.031067 4753 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dcad9510-6729-4c00-8857-b386e6f05806" containerName="nova-scheduler-scheduler" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.412331 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.422668 4753 generic.go:334] "Generic (PLEG): container finished" podID="c40be523-36a9-4bd7-8568-94e2921da7bd" containerID="7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae" exitCode=0 Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.422730 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.422761 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c40be523-36a9-4bd7-8568-94e2921da7bd","Type":"ContainerDied","Data":"7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae"} Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.422806 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c40be523-36a9-4bd7-8568-94e2921da7bd","Type":"ContainerDied","Data":"2d6167c935c3ed3859388a9123b851d778486d3268117fd4f964957967df613c"} Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.422824 4753 scope.go:117] "RemoveContainer" containerID="7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.477947 4753 scope.go:117] "RemoveContainer" containerID="7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae" Jan 29 15:36:00 crc kubenswrapper[4753]: E0129 15:36:00.484329 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae\": container with ID starting with 7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae not found: ID does not exist" containerID="7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.484381 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae"} err="failed to get container status \"7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae\": rpc error: code = NotFound desc = could not find container \"7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae\": container with ID starting with 7c2dd558793e99a0f32c7695a900487dca4e73436d8a4ece56efa6423ff251ae not found: ID does not exist" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.490822 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-config-data\") pod 
\"c40be523-36a9-4bd7-8568-94e2921da7bd\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.491092 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpw5j\" (UniqueName: \"kubernetes.io/projected/c40be523-36a9-4bd7-8568-94e2921da7bd-kube-api-access-dpw5j\") pod \"c40be523-36a9-4bd7-8568-94e2921da7bd\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.491248 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-combined-ca-bundle\") pod \"c40be523-36a9-4bd7-8568-94e2921da7bd\" (UID: \"c40be523-36a9-4bd7-8568-94e2921da7bd\") " Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.498511 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40be523-36a9-4bd7-8568-94e2921da7bd-kube-api-access-dpw5j" (OuterVolumeSpecName: "kube-api-access-dpw5j") pod "c40be523-36a9-4bd7-8568-94e2921da7bd" (UID: "c40be523-36a9-4bd7-8568-94e2921da7bd"). InnerVolumeSpecName "kube-api-access-dpw5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.518032 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c40be523-36a9-4bd7-8568-94e2921da7bd" (UID: "c40be523-36a9-4bd7-8568-94e2921da7bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.530401 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-config-data" (OuterVolumeSpecName: "config-data") pod "c40be523-36a9-4bd7-8568-94e2921da7bd" (UID: "c40be523-36a9-4bd7-8568-94e2921da7bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.593779 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.593831 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40be523-36a9-4bd7-8568-94e2921da7bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.593847 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpw5j\" (UniqueName: \"kubernetes.io/projected/c40be523-36a9-4bd7-8568-94e2921da7bd-kube-api-access-dpw5j\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.756838 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.768302 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.780283 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 15:36:00 crc kubenswrapper[4753]: E0129 15:36:00.780833 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40be523-36a9-4bd7-8568-94e2921da7bd" containerName="nova-cell1-conductor-conductor" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.780861 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40be523-36a9-4bd7-8568-94e2921da7bd" containerName="nova-cell1-conductor-conductor" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.781111 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="c40be523-36a9-4bd7-8568-94e2921da7bd" containerName="nova-cell1-conductor-conductor" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.782009 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.784135 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.790371 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.799027 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016bbe0f-680c-4d36-a48d-8092b9174669-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"016bbe0f-680c-4d36-a48d-8092b9174669\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.799163 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016bbe0f-680c-4d36-a48d-8092b9174669-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"016bbe0f-680c-4d36-a48d-8092b9174669\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.799292 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p2qw\" (UniqueName: \"kubernetes.io/projected/016bbe0f-680c-4d36-a48d-8092b9174669-kube-api-access-8p2qw\") pod \"nova-cell1-conductor-0\" (UID: \"016bbe0f-680c-4d36-a48d-8092b9174669\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.900846 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p2qw\" (UniqueName: \"kubernetes.io/projected/016bbe0f-680c-4d36-a48d-8092b9174669-kube-api-access-8p2qw\") pod \"nova-cell1-conductor-0\" (UID: \"016bbe0f-680c-4d36-a48d-8092b9174669\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.900971 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016bbe0f-680c-4d36-a48d-8092b9174669-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"016bbe0f-680c-4d36-a48d-8092b9174669\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.901016 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016bbe0f-680c-4d36-a48d-8092b9174669-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"016bbe0f-680c-4d36-a48d-8092b9174669\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.906315 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/016bbe0f-680c-4d36-a48d-8092b9174669-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"016bbe0f-680c-4d36-a48d-8092b9174669\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.907267 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/016bbe0f-680c-4d36-a48d-8092b9174669-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"016bbe0f-680c-4d36-a48d-8092b9174669\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:00 crc kubenswrapper[4753]: I0129 15:36:00.930434 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p2qw\" (UniqueName: \"kubernetes.io/projected/016bbe0f-680c-4d36-a48d-8092b9174669-kube-api-access-8p2qw\") pod \"nova-cell1-conductor-0\" (UID: \"016bbe0f-680c-4d36-a48d-8092b9174669\") " pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:01 crc kubenswrapper[4753]: I0129 15:36:01.106410 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:01 crc kubenswrapper[4753]: I0129 15:36:01.654231 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 29 15:36:02 crc kubenswrapper[4753]: I0129 15:36:02.159980 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40be523-36a9-4bd7-8568-94e2921da7bd" path="/var/lib/kubelet/pods/c40be523-36a9-4bd7-8568-94e2921da7bd/volumes" Jan 29 15:36:02 crc kubenswrapper[4753]: I0129 15:36:02.443317 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"016bbe0f-680c-4d36-a48d-8092b9174669","Type":"ContainerStarted","Data":"d1bbe91f769e4992d8ee283919dbece39c9c811876ba38edaf5ce77582e7b8d9"} Jan 29 15:36:02 crc kubenswrapper[4753]: I0129 15:36:02.443371 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"016bbe0f-680c-4d36-a48d-8092b9174669","Type":"ContainerStarted","Data":"bb5a2b6a42abb5009f703916ef5a160b26c336fe3f839ed38252d94a8d69f83e"} Jan 29 15:36:02 crc kubenswrapper[4753]: I0129 15:36:02.443486 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:02 crc kubenswrapper[4753]: I0129 15:36:02.475976 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.47595112 podStartE2EDuration="2.47595112s" podCreationTimestamp="2026-01-29 15:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:36:02.462843548 +0000 UTC m=+5597.157577930" watchObservedRunningTime="2026-01-29 15:36:02.47595112 +0000 UTC m=+5597.170685532" Jan 29 15:36:02 crc kubenswrapper[4753]: I0129 15:36:02.888713 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 15:36:02 crc kubenswrapper[4753]: I0129 15:36:02.888782 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 29 15:36:03 crc kubenswrapper[4753]: I0129 15:36:03.903572 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.067241 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-combined-ca-bundle\") pod \"dcad9510-6729-4c00-8857-b386e6f05806\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.067321 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89rsp\" (UniqueName: \"kubernetes.io/projected/dcad9510-6729-4c00-8857-b386e6f05806-kube-api-access-89rsp\") pod \"dcad9510-6729-4c00-8857-b386e6f05806\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.068403 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-config-data\") pod \"dcad9510-6729-4c00-8857-b386e6f05806\" (UID: \"dcad9510-6729-4c00-8857-b386e6f05806\") " Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.072731 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcad9510-6729-4c00-8857-b386e6f05806-kube-api-access-89rsp" (OuterVolumeSpecName: "kube-api-access-89rsp") pod "dcad9510-6729-4c00-8857-b386e6f05806" (UID: "dcad9510-6729-4c00-8857-b386e6f05806"). InnerVolumeSpecName "kube-api-access-89rsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.095956 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcad9510-6729-4c00-8857-b386e6f05806" (UID: "dcad9510-6729-4c00-8857-b386e6f05806"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.097850 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-config-data" (OuterVolumeSpecName: "config-data") pod "dcad9510-6729-4c00-8857-b386e6f05806" (UID: "dcad9510-6729-4c00-8857-b386e6f05806"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.171258 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.171293 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89rsp\" (UniqueName: \"kubernetes.io/projected/dcad9510-6729-4c00-8857-b386e6f05806-kube-api-access-89rsp\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.171306 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcad9510-6729-4c00-8857-b386e6f05806-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.463592 4753 generic.go:334] "Generic (PLEG): container finished" podID="dcad9510-6729-4c00-8857-b386e6f05806" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" exitCode=0 Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.463869 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcad9510-6729-4c00-8857-b386e6f05806","Type":"ContainerDied","Data":"0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336"} Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.463898 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dcad9510-6729-4c00-8857-b386e6f05806","Type":"ContainerDied","Data":"a9676b80d4b1fc0fd2c6f6ba0b692b5d43430ed22e2b95200a192ecd7d07e539"} Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.463915 4753 scope.go:117] "RemoveContainer" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.464042 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.490194 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.498891 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.507728 4753 scope.go:117] "RemoveContainer" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" Jan 29 15:36:04 crc kubenswrapper[4753]: E0129 15:36:04.508343 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336\": container with ID starting with 0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336 not found: ID does not exist" containerID="0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.508376 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336"} err="failed to get container status \"0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336\": rpc error: code = NotFound desc = could not find container \"0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336\": container with ID starting with 0c8e8edc0bd063eea472c15b742848b61306e591188e6fb13f5a41a0e5a80336 not found: ID does not exist" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.512549 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:36:04 crc kubenswrapper[4753]: E0129 15:36:04.513064 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcad9510-6729-4c00-8857-b386e6f05806" containerName="nova-scheduler-scheduler" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.513087 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcad9510-6729-4c00-8857-b386e6f05806" containerName="nova-scheduler-scheduler" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.513361 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcad9510-6729-4c00-8857-b386e6f05806" containerName="nova-scheduler-scheduler" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.514130 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.520086 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.549486 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.686441 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwcjt\" (UniqueName: \"kubernetes.io/projected/dbd601de-74ab-4a96-9e9d-47fc53a047ac-kube-api-access-xwcjt\") pod \"nova-scheduler-0\" (UID: \"dbd601de-74ab-4a96-9e9d-47fc53a047ac\") " pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.686695 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd601de-74ab-4a96-9e9d-47fc53a047ac-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbd601de-74ab-4a96-9e9d-47fc53a047ac\") " pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.686918 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd601de-74ab-4a96-9e9d-47fc53a047ac-config-data\") pod \"nova-scheduler-0\" (UID: \"dbd601de-74ab-4a96-9e9d-47fc53a047ac\") " pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.724921 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.740932 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.788483 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd601de-74ab-4a96-9e9d-47fc53a047ac-config-data\") pod \"nova-scheduler-0\" (UID: \"dbd601de-74ab-4a96-9e9d-47fc53a047ac\") " pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.788547 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwcjt\" (UniqueName: \"kubernetes.io/projected/dbd601de-74ab-4a96-9e9d-47fc53a047ac-kube-api-access-xwcjt\") pod \"nova-scheduler-0\" (UID: \"dbd601de-74ab-4a96-9e9d-47fc53a047ac\") " pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.788636 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd601de-74ab-4a96-9e9d-47fc53a047ac-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbd601de-74ab-4a96-9e9d-47fc53a047ac\") " pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.794430 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd601de-74ab-4a96-9e9d-47fc53a047ac-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbd601de-74ab-4a96-9e9d-47fc53a047ac\") " pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.804419 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dbd601de-74ab-4a96-9e9d-47fc53a047ac-config-data\") pod \"nova-scheduler-0\" (UID: \"dbd601de-74ab-4a96-9e9d-47fc53a047ac\") " pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.808713 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwcjt\" (UniqueName: \"kubernetes.io/projected/dbd601de-74ab-4a96-9e9d-47fc53a047ac-kube-api-access-xwcjt\") pod \"nova-scheduler-0\" (UID: \"dbd601de-74ab-4a96-9e9d-47fc53a047ac\") " pod="openstack/nova-scheduler-0" Jan 29 15:36:04 crc kubenswrapper[4753]: I0129 15:36:04.880195 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 29 15:36:05 crc kubenswrapper[4753]: I0129 15:36:05.328659 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 29 15:36:05 crc kubenswrapper[4753]: I0129 15:36:05.492442 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbd601de-74ab-4a96-9e9d-47fc53a047ac","Type":"ContainerStarted","Data":"0ffd67546bf4e98de41d1f54f3343f516a2ca37a78349c5077c31877cf445ab4"} Jan 29 15:36:05 crc kubenswrapper[4753]: I0129 15:36:05.510873 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 29 15:36:06 crc kubenswrapper[4753]: I0129 15:36:06.142186 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 29 15:36:06 crc kubenswrapper[4753]: I0129 15:36:06.174735 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcad9510-6729-4c00-8857-b386e6f05806" path="/var/lib/kubelet/pods/dcad9510-6729-4c00-8857-b386e6f05806/volumes" Jan 29 15:36:06 crc kubenswrapper[4753]: I0129 15:36:06.507877 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbd601de-74ab-4a96-9e9d-47fc53a047ac","Type":"ContainerStarted","Data":"4c8a85ccc4683e3e7ef1603b25c45db83e47d68088715137bc5ad984eb129ef7"} Jan 29 15:36:06 crc kubenswrapper[4753]: I0129 15:36:06.543904 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.543885001 podStartE2EDuration="2.543885001s" podCreationTimestamp="2026-01-29 15:36:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:36:06.540545882 +0000 UTC m=+5601.235280264" watchObservedRunningTime="2026-01-29 15:36:06.543885001 +0000 UTC m=+5601.238619383" Jan 29 15:36:06 crc kubenswrapper[4753]: I0129 15:36:06.777033 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 29 15:36:07 crc kubenswrapper[4753]: I0129 15:36:07.823021 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 15:36:07 crc kubenswrapper[4753]: I0129 15:36:07.823099 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 29 15:36:07 crc kubenswrapper[4753]: I0129 15:36:07.888784 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 15:36:07 crc kubenswrapper[4753]: I0129 15:36:07.888839 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 29 15:36:08 crc kubenswrapper[4753]: I0129 
Jan 29 15:36:08 crc kubenswrapper[4753]: I0129 15:36:08.906381 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="493fa42b-a667-4c4f-8473-2848bb63714b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.92:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 29 15:36:08 crc kubenswrapper[4753]: I0129 15:36:08.906380 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="493fa42b-a667-4c4f-8473-2848bb63714b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.92:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 29 15:36:08 crc kubenswrapper[4753]: I0129 15:36:08.989533 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80fcb37d-8d12-4311-859b-6aeb6e200031" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.93:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 29 15:36:08 crc kubenswrapper[4753]: I0129 15:36:08.989533 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80fcb37d-8d12-4311-859b-6aeb6e200031" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.93:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 29 15:36:09 crc kubenswrapper[4753]: I0129 15:36:09.880410 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 29 15:36:11 crc kubenswrapper[4753]: I0129 15:36:11.806038 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
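[editor's note] The "Probe failed" records above are HTTP GET startup probes whose client-side timeout expires before the service sends its first response byte. An illustrative sketch (assumed setup, not kubelet code) that reproduces the exact error text using Go's standard library:

// A slow handler plus http.Client.Timeout yields the same error string that
// prober.go logs above while nova-api and nova-metadata are still initializing.
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	slow := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Second) // headers never arrive within the probe timeout
	}))
	defer slow.Close()

	client := &http.Client{Timeout: 1 * time.Second}
	_, err := client.Get(slow.URL)
	// Prints: Get "http://127.0.0.1:...": context deadline exceeded
	// (Client.Timeout exceeded while awaiting headers)
	fmt.Println(err)
}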
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 15:36:11 crc kubenswrapper[4753]: I0129 15:36:11.810512 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 29 15:36:11 crc kubenswrapper[4753]: I0129 15:36:11.816527 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 15:36:11 crc kubenswrapper[4753]: I0129 15:36:11.950822 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqnq6\" (UniqueName: \"kubernetes.io/projected/22c2d787-a0d2-4f05-899d-d50c58bec04d-kube-api-access-pqnq6\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:11 crc kubenswrapper[4753]: I0129 15:36:11.950897 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-scripts\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:11 crc kubenswrapper[4753]: I0129 15:36:11.950917 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:11 crc kubenswrapper[4753]: I0129 15:36:11.951211 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:11 crc kubenswrapper[4753]: I0129 15:36:11.951275 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22c2d787-a0d2-4f05-899d-d50c58bec04d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:11 crc kubenswrapper[4753]: I0129 15:36:11.951319 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.053558 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.053629 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22c2d787-a0d2-4f05-899d-d50c58bec04d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.053665 4753 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.053793 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqnq6\" (UniqueName: \"kubernetes.io/projected/22c2d787-a0d2-4f05-899d-d50c58bec04d-kube-api-access-pqnq6\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.053847 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-scripts\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.053876 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.054323 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22c2d787-a0d2-4f05-899d-d50c58bec04d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.062248 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.062369 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.062722 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-scripts\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.062727 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.083505 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqnq6\" (UniqueName: \"kubernetes.io/projected/22c2d787-a0d2-4f05-899d-d50c58bec04d-kube-api-access-pqnq6\") pod \"cinder-scheduler-0\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:12 
Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.129990 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.592660 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.967954 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.968210 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerName="cinder-api-log" containerID="cri-o://f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90" gracePeriod=30
Jan 29 15:36:12 crc kubenswrapper[4753]: I0129 15:36:12.968364 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerName="cinder-api" containerID="cri-o://9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf" gracePeriod=30
Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.575796 4753 generic.go:334] "Generic (PLEG): container finished" podID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerID="f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90" exitCode=143
Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.575856 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d0b86796-88a4-4177-92ce-ea4cbceb4749","Type":"ContainerDied","Data":"f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90"}
Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.578493 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22c2d787-a0d2-4f05-899d-d50c58bec04d","Type":"ContainerStarted","Data":"919a220bef0dd7a70c1dc5f2dd0b2d0195db3b783b2bf68dbd28cfa2a2e6a66e"}
Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.578538 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22c2d787-a0d2-4f05-899d-d50c58bec04d","Type":"ContainerStarted","Data":"d21eae36cc95b74bf5a1dac734b960516a9162792b123fbcc45d36851b31fe49"}
Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.605102 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
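[editor's note] The exitCode=143 above, arriving right after "Killing container with a grace period" (gracePeriod=30), is the shell-style signature of death by SIGTERM within the grace window, not a SIGKILL. A minimal illustration of the convention (not kubelet code):

// Death-by-signal is conventionally reported as 128 plus the signal number;
// SIGTERM is 15, hence the 143 logged above for cinder-api-log.
package main

import (
	"fmt"
	"syscall"
)

func main() {
	fmt.Println(128 + int(syscall.SIGTERM)) // 143
	// By contrast, exitCode=0 (seen later for the cinder-api container)
	// means the process shut down cleanly before the 30s grace period expired.
}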
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.609597 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.620914 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786178 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786224 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786251 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786269 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786292 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786310 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786332 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-sys\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786353 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 
crc kubenswrapper[4753]: I0129 15:36:13.786374 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786409 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786425 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-dev\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786439 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786461 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnt8g\" (UniqueName: \"kubernetes.io/projected/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-kube-api-access-vnt8g\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786491 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786522 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-run\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.786538 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.887802 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.887853 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-dev\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.887879 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.887909 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnt8g\" (UniqueName: \"kubernetes.io/projected/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-kube-api-access-vnt8g\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.887947 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.887990 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-run\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.887983 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-dev\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888042 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888014 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888006 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888129 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888137 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888212 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888242 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888263 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888293 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888316 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888344 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-sys\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888373 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888404 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888675 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888689 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-run\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888832 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888787 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-sys\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.888978 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.892356 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.893942 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.895243 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.895343 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.896916 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0" Jan 29 
Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.897323 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0"
Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.914988 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnt8g\" (UniqueName: \"kubernetes.io/projected/6d501b41-fcc7-47e2-8dae-b76c5b7b0519-kube-api-access-vnt8g\") pod \"cinder-volume-volume1-0\" (UID: \"6d501b41-fcc7-47e2-8dae-b76c5b7b0519\") " pod="openstack/cinder-volume-volume1-0"
Jan 29 15:36:13 crc kubenswrapper[4753]: I0129 15:36:13.940648 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.583088 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.590677 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.594600 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.606454 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.610334 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22c2d787-a0d2-4f05-899d-d50c58bec04d","Type":"ContainerStarted","Data":"e38367a314fbe2b1f35471d80ead236585fc42d34bcd7e0e66cc76da0dcc59ec"}
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.672165 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.672133128 podStartE2EDuration="3.672133128s" podCreationTimestamp="2026-01-29 15:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:36:14.660953609 +0000 UTC m=+5609.355687991" watchObservedRunningTime="2026-01-29 15:36:14.672133128 +0000 UTC m=+5609.366867510"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.694768 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.709686 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.709929 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/567bff90-0d20-4a19-b301-eac95246fc6d-ceph\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.710000 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-sys\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.710067 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-run\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.710143 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbv9v\" (UniqueName: \"kubernetes.io/projected/567bff90-0d20-4a19-b301-eac95246fc6d-kube-api-access-gbv9v\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.710253 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.710433 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.710527 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.710599 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.710675 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-scripts\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.710757 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.711369 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-lib-modules\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.711637 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-config-data\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.711813 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.711842 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-dev\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.711862 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.813598 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/567bff90-0d20-4a19-b301-eac95246fc6d-ceph\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.813978 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-sys\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814010 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-run\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814039 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbv9v\" (UniqueName: \"kubernetes.io/projected/567bff90-0d20-4a19-b301-eac95246fc6d-kube-api-access-gbv9v\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814070 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814091 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-sys\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814117 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814139 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814205 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-run\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814146 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814345 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814397 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814492 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814531 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-scripts\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814631 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814748 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-lib-modules\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814807 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-config-data\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814849 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814874 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814901 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-dev\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.814984 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.815668 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.816385 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.816446 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-lib-modules\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.816480 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-dev\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.816560 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/567bff90-0d20-4a19-b301-eac95246fc6d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.819929 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.820367 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-config-data\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.820587 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-scripts\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.823587 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/567bff90-0d20-4a19-b301-eac95246fc6d-ceph\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.836299 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/567bff90-0d20-4a19-b301-eac95246fc6d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.844834 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbv9v\" (UniqueName: \"kubernetes.io/projected/567bff90-0d20-4a19-b301-eac95246fc6d-kube-api-access-gbv9v\") pod \"cinder-backup-0\" (UID: \"567bff90-0d20-4a19-b301-eac95246fc6d\") " pod="openstack/cinder-backup-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.881194 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 29 15:36:14 crc kubenswrapper[4753]: I0129 15:36:14.916069 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Need to start a new one" pod="openstack/cinder-backup-0" Jan 29 15:36:15 crc kubenswrapper[4753]: I0129 15:36:15.529784 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 29 15:36:15 crc kubenswrapper[4753]: W0129 15:36:15.565577 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod567bff90_0d20_4a19_b301_eac95246fc6d.slice/crio-ef06676f5ae7790b81118f796c746925cfebb284be7a183bcc7969945e2dc1bd WatchSource:0}: Error finding container ef06676f5ae7790b81118f796c746925cfebb284be7a183bcc7969945e2dc1bd: Status 404 returned error can't find the container with id ef06676f5ae7790b81118f796c746925cfebb284be7a183bcc7969945e2dc1bd Jan 29 15:36:15 crc kubenswrapper[4753]: I0129 15:36:15.621217 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"6d501b41-fcc7-47e2-8dae-b76c5b7b0519","Type":"ContainerStarted","Data":"791639bb1beb9a7cc632e44f6e760f75ab621fe4a9ec7ea374b15c178dc5ca91"} Jan 29 15:36:15 crc kubenswrapper[4753]: I0129 15:36:15.622813 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"567bff90-0d20-4a19-b301-eac95246fc6d","Type":"ContainerStarted","Data":"ef06676f5ae7790b81118f796c746925cfebb284be7a183bcc7969945e2dc1bd"} Jan 29 15:36:15 crc kubenswrapper[4753]: I0129 15:36:15.656533 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.602214 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.659877 4753 generic.go:334] "Generic (PLEG): container finished" podID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerID="9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf" exitCode=0 Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.659957 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d0b86796-88a4-4177-92ce-ea4cbceb4749","Type":"ContainerDied","Data":"9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf"} Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.660000 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d0b86796-88a4-4177-92ce-ea4cbceb4749","Type":"ContainerDied","Data":"76160e8d7dcdfea6edddc64b78fda41581a602189c60bf5f79dd166501d5de94"} Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.660025 4753 scope.go:117] "RemoveContainer" containerID="9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.660304 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.669515 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"6d501b41-fcc7-47e2-8dae-b76c5b7b0519","Type":"ContainerStarted","Data":"99f94497671d0d41c8f411b91e7b2955449cb40bc932244df965393ff5d8989a"} Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.669579 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"6d501b41-fcc7-47e2-8dae-b76c5b7b0519","Type":"ContainerStarted","Data":"9cf97be7f003276e01d487bd3eaa62f8d3e48d5d01dc4b4f9719aae448e9654a"} Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.711429 4753 scope.go:117] "RemoveContainer" containerID="f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.746507 4753 scope.go:117] "RemoveContainer" containerID="9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf" Jan 29 15:36:16 crc kubenswrapper[4753]: E0129 15:36:16.749658 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf\": container with ID starting with 9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf not found: ID does not exist" containerID="9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.749721 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf"} err="failed to get container status \"9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf\": rpc error: code = NotFound desc = could not find container \"9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf\": container with ID starting with 9cc637cfff8938fb87e54981e9719fca19413e00efd13b8becf105ef402019cf not found: ID does not exist" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.749759 4753 scope.go:117] "RemoveContainer" containerID="f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90" Jan 29 15:36:16 crc kubenswrapper[4753]: E0129 15:36:16.751186 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90\": container with ID starting with f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90 not found: ID does not exist" containerID="f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.751208 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90"} err="failed to get container status \"f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90\": rpc error: code = NotFound desc = could not find container \"f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90\": container with ID starting with f06e7e187974b9d63c727eb3d7da72da9b98c97828138037fa3ada263b9d9f90 not found: ID does not exist" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.763664 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
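[editor's note] The NotFound noise above is benign: by the time the kubelet's RemoveContainer cleanup asks CRI-O for the status of the two cinder-api containers, they have already been removed, so the runtime answers with gRPC code NotFound and the kubelet logs and moves on. A sketch of that idempotent-delete pattern (an assumed helper, not kubelet source):

// A container that is already gone is treated as successfully removed.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer wraps a runtime delete call; remove stands in for whatever
// client method performs the actual RPC.
func removeContainer(id string, remove func(string) error) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // already removed by a concurrent cleanup; nothing to do
		}
		return fmt.Errorf("failed to remove container %q: %w", id, err)
	}
	return nil
}

func main() {
	gone := func(string) error { return status.Error(codes.NotFound, "ID does not exist") }
	fmt.Println(removeContainer("9cc637cfff89", gone)) // <nil>
}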
\"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data-custom\") pod \"d0b86796-88a4-4177-92ce-ea4cbceb4749\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.763735 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0b86796-88a4-4177-92ce-ea4cbceb4749-etc-machine-id\") pod \"d0b86796-88a4-4177-92ce-ea4cbceb4749\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.763816 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-combined-ca-bundle\") pod \"d0b86796-88a4-4177-92ce-ea4cbceb4749\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.763880 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0b86796-88a4-4177-92ce-ea4cbceb4749-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d0b86796-88a4-4177-92ce-ea4cbceb4749" (UID: "d0b86796-88a4-4177-92ce-ea4cbceb4749"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.763923 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b86796-88a4-4177-92ce-ea4cbceb4749-logs\") pod \"d0b86796-88a4-4177-92ce-ea4cbceb4749\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.763943 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data\") pod \"d0b86796-88a4-4177-92ce-ea4cbceb4749\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.764054 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkvnh\" (UniqueName: \"kubernetes.io/projected/d0b86796-88a4-4177-92ce-ea4cbceb4749-kube-api-access-rkvnh\") pod \"d0b86796-88a4-4177-92ce-ea4cbceb4749\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.764076 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-scripts\") pod \"d0b86796-88a4-4177-92ce-ea4cbceb4749\" (UID: \"d0b86796-88a4-4177-92ce-ea4cbceb4749\") " Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.764675 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b86796-88a4-4177-92ce-ea4cbceb4749-logs" (OuterVolumeSpecName: "logs") pod "d0b86796-88a4-4177-92ce-ea4cbceb4749" (UID: "d0b86796-88a4-4177-92ce-ea4cbceb4749"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.766554 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0b86796-88a4-4177-92ce-ea4cbceb4749-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.766577 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b86796-88a4-4177-92ce-ea4cbceb4749-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.772026 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b86796-88a4-4177-92ce-ea4cbceb4749-kube-api-access-rkvnh" (OuterVolumeSpecName: "kube-api-access-rkvnh") pod "d0b86796-88a4-4177-92ce-ea4cbceb4749" (UID: "d0b86796-88a4-4177-92ce-ea4cbceb4749"). InnerVolumeSpecName "kube-api-access-rkvnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.777268 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-scripts" (OuterVolumeSpecName: "scripts") pod "d0b86796-88a4-4177-92ce-ea4cbceb4749" (UID: "d0b86796-88a4-4177-92ce-ea4cbceb4749"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.777773 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d0b86796-88a4-4177-92ce-ea4cbceb4749" (UID: "d0b86796-88a4-4177-92ce-ea4cbceb4749"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.816720 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0b86796-88a4-4177-92ce-ea4cbceb4749" (UID: "d0b86796-88a4-4177-92ce-ea4cbceb4749"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.825105 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data" (OuterVolumeSpecName: "config-data") pod "d0b86796-88a4-4177-92ce-ea4cbceb4749" (UID: "d0b86796-88a4-4177-92ce-ea4cbceb4749"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.868216 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.868262 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkvnh\" (UniqueName: \"kubernetes.io/projected/d0b86796-88a4-4177-92ce-ea4cbceb4749-kube-api-access-rkvnh\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.868278 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.868289 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:16 crc kubenswrapper[4753]: I0129 15:36:16.868305 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b86796-88a4-4177-92ce-ea4cbceb4749-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.003561 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.087857744 podStartE2EDuration="4.003539439s" podCreationTimestamp="2026-01-29 15:36:13 +0000 UTC" firstStartedPulling="2026-01-29 15:36:14.695282979 +0000 UTC m=+5609.390017361" lastFinishedPulling="2026-01-29 15:36:15.610964674 +0000 UTC m=+5610.305699056" observedRunningTime="2026-01-29 15:36:16.709571007 +0000 UTC m=+5611.404305389" watchObservedRunningTime="2026-01-29 15:36:17.003539439 +0000 UTC m=+5611.698273831" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.011216 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.063692 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.099588 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 29 15:36:17 crc kubenswrapper[4753]: E0129 15:36:17.111271 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerName="cinder-api" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.111331 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerName="cinder-api" Jan 29 15:36:17 crc kubenswrapper[4753]: E0129 15:36:17.111359 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerName="cinder-api-log" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.111367 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerName="cinder-api-log" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.112012 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerName="cinder-api-log" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.112036 4753 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="d0b86796-88a4-4177-92ce-ea4cbceb4749" containerName="cinder-api" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.122568 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.122672 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.125250 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.135564 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.177343 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-config-data-custom\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.177466 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d567cae9-e775-4bd7-af21-b75b563dd220-logs\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.177550 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-scripts\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.177764 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d567cae9-e775-4bd7-af21-b75b563dd220-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.177790 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn477\" (UniqueName: \"kubernetes.io/projected/d567cae9-e775-4bd7-af21-b75b563dd220-kube-api-access-vn477\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.177831 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.177860 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-config-data\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.280046 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-scripts\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.280192 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d567cae9-e775-4bd7-af21-b75b563dd220-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.280221 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn477\" (UniqueName: \"kubernetes.io/projected/d567cae9-e775-4bd7-af21-b75b563dd220-kube-api-access-vn477\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.280257 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.280282 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-config-data\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.280362 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-config-data-custom\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.280380 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d567cae9-e775-4bd7-af21-b75b563dd220-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.280426 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d567cae9-e775-4bd7-af21-b75b563dd220-logs\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.280781 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d567cae9-e775-4bd7-af21-b75b563dd220-logs\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.284746 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-scripts\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.285103 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-config-data-custom\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.285170 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.288191 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d567cae9-e775-4bd7-af21-b75b563dd220-config-data\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.304875 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn477\" (UniqueName: \"kubernetes.io/projected/d567cae9-e775-4bd7-af21-b75b563dd220-kube-api-access-vn477\") pod \"cinder-api-0\" (UID: \"d567cae9-e775-4bd7-af21-b75b563dd220\") " pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.453891 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.683675 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"567bff90-0d20-4a19-b301-eac95246fc6d","Type":"ContainerStarted","Data":"8474c7d883abf0bc26a823097d69d313687e849699459a779b4681cad3dfa30f"} Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.684041 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"567bff90-0d20-4a19-b301-eac95246fc6d","Type":"ContainerStarted","Data":"392853ee76570520f72551050c91480a2188c4f758d98bc1a5a22fe5277b1e8d"} Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.713125 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.870880808 podStartE2EDuration="3.713108774s" podCreationTimestamp="2026-01-29 15:36:14 +0000 UTC" firstStartedPulling="2026-01-29 15:36:15.567472439 +0000 UTC m=+5610.262206821" lastFinishedPulling="2026-01-29 15:36:16.409700405 +0000 UTC m=+5611.104434787" observedRunningTime="2026-01-29 15:36:17.706623661 +0000 UTC m=+5612.401358053" watchObservedRunningTime="2026-01-29 15:36:17.713108774 +0000 UTC m=+5612.407843156" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.828653 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.829190 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.829675 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.829695 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.834456 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.836518 4753 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.898662 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.911450 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 29 15:36:17 crc kubenswrapper[4753]: I0129 15:36:17.913007 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 29 15:36:18 crc kubenswrapper[4753]: I0129 15:36:18.002534 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 29 15:36:18 crc kubenswrapper[4753]: I0129 15:36:18.168574 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b86796-88a4-4177-92ce-ea4cbceb4749" path="/var/lib/kubelet/pods/d0b86796-88a4-4177-92ce-ea4cbceb4749/volumes" Jan 29 15:36:18 crc kubenswrapper[4753]: I0129 15:36:18.699860 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d567cae9-e775-4bd7-af21-b75b563dd220","Type":"ContainerStarted","Data":"4f4a8b0d38e5c7401f8782c5ea0171b9b425f32f82bfad49e02fb4a77d14a713"} Jan 29 15:36:18 crc kubenswrapper[4753]: I0129 15:36:18.704581 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 29 15:36:18 crc kubenswrapper[4753]: I0129 15:36:18.961709 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:19 crc kubenswrapper[4753]: I0129 15:36:19.710488 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d567cae9-e775-4bd7-af21-b75b563dd220","Type":"ContainerStarted","Data":"40a049275076599f4a75d77c1c60b9b5698c71adfb42acf196e55d4196d58c61"} Jan 29 15:36:19 crc kubenswrapper[4753]: I0129 15:36:19.710767 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d567cae9-e775-4bd7-af21-b75b563dd220","Type":"ContainerStarted","Data":"83a742d926e6bcfefd6422edfa2680a4f8e3982ee05e4fdb296f2cb9491aaae3"} Jan 29 15:36:19 crc kubenswrapper[4753]: I0129 15:36:19.733037 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.7330162529999997 podStartE2EDuration="2.733016253s" podCreationTimestamp="2026-01-29 15:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:36:19.732547781 +0000 UTC m=+5614.427282163" watchObservedRunningTime="2026-01-29 15:36:19.733016253 +0000 UTC m=+5614.427750635" Jan 29 15:36:19 crc kubenswrapper[4753]: I0129 15:36:19.954843 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jan 29 15:36:20 crc kubenswrapper[4753]: I0129 15:36:20.720961 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 29 15:36:22 crc kubenswrapper[4753]: I0129 15:36:22.336995 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 29 15:36:22 crc kubenswrapper[4753]: I0129 15:36:22.422114 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 15:36:22 crc kubenswrapper[4753]: I0129 15:36:22.738441 4753 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/cinder-scheduler-0" podUID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerName="cinder-scheduler" containerID="cri-o://919a220bef0dd7a70c1dc5f2dd0b2d0195db3b783b2bf68dbd28cfa2a2e6a66e" gracePeriod=30 Jan 29 15:36:22 crc kubenswrapper[4753]: I0129 15:36:22.738569 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerName="probe" containerID="cri-o://e38367a314fbe2b1f35471d80ead236585fc42d34bcd7e0e66cc76da0dcc59ec" gracePeriod=30 Jan 29 15:36:23 crc kubenswrapper[4753]: I0129 15:36:23.751391 4753 generic.go:334] "Generic (PLEG): container finished" podID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerID="e38367a314fbe2b1f35471d80ead236585fc42d34bcd7e0e66cc76da0dcc59ec" exitCode=0 Jan 29 15:36:23 crc kubenswrapper[4753]: I0129 15:36:23.751531 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22c2d787-a0d2-4f05-899d-d50c58bec04d","Type":"ContainerDied","Data":"e38367a314fbe2b1f35471d80ead236585fc42d34bcd7e0e66cc76da0dcc59ec"} Jan 29 15:36:24 crc kubenswrapper[4753]: I0129 15:36:24.147187 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Jan 29 15:36:24 crc kubenswrapper[4753]: I0129 15:36:24.761997 4753 generic.go:334] "Generic (PLEG): container finished" podID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerID="919a220bef0dd7a70c1dc5f2dd0b2d0195db3b783b2bf68dbd28cfa2a2e6a66e" exitCode=0 Jan 29 15:36:24 crc kubenswrapper[4753]: I0129 15:36:24.762071 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22c2d787-a0d2-4f05-899d-d50c58bec04d","Type":"ContainerDied","Data":"919a220bef0dd7a70c1dc5f2dd0b2d0195db3b783b2bf68dbd28cfa2a2e6a66e"} Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.119444 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.184338 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22c2d787-a0d2-4f05-899d-d50c58bec04d-etc-machine-id\") pod \"22c2d787-a0d2-4f05-899d-d50c58bec04d\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.184403 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqnq6\" (UniqueName: \"kubernetes.io/projected/22c2d787-a0d2-4f05-899d-d50c58bec04d-kube-api-access-pqnq6\") pod \"22c2d787-a0d2-4f05-899d-d50c58bec04d\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.184483 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22c2d787-a0d2-4f05-899d-d50c58bec04d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "22c2d787-a0d2-4f05-899d-d50c58bec04d" (UID: "22c2d787-a0d2-4f05-899d-d50c58bec04d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.184550 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-combined-ca-bundle\") pod \"22c2d787-a0d2-4f05-899d-d50c58bec04d\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.184577 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data\") pod \"22c2d787-a0d2-4f05-899d-d50c58bec04d\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.184615 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-scripts\") pod \"22c2d787-a0d2-4f05-899d-d50c58bec04d\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.184676 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data-custom\") pod \"22c2d787-a0d2-4f05-899d-d50c58bec04d\" (UID: \"22c2d787-a0d2-4f05-899d-d50c58bec04d\") " Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.185173 4753 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22c2d787-a0d2-4f05-899d-d50c58bec04d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.193097 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-scripts" (OuterVolumeSpecName: "scripts") pod "22c2d787-a0d2-4f05-899d-d50c58bec04d" (UID: "22c2d787-a0d2-4f05-899d-d50c58bec04d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.193359 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "22c2d787-a0d2-4f05-899d-d50c58bec04d" (UID: "22c2d787-a0d2-4f05-899d-d50c58bec04d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.194272 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c2d787-a0d2-4f05-899d-d50c58bec04d-kube-api-access-pqnq6" (OuterVolumeSpecName: "kube-api-access-pqnq6") pod "22c2d787-a0d2-4f05-899d-d50c58bec04d" (UID: "22c2d787-a0d2-4f05-899d-d50c58bec04d"). InnerVolumeSpecName "kube-api-access-pqnq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.214419 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.272531 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22c2d787-a0d2-4f05-899d-d50c58bec04d" (UID: "22c2d787-a0d2-4f05-899d-d50c58bec04d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.287091 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqnq6\" (UniqueName: \"kubernetes.io/projected/22c2d787-a0d2-4f05-899d-d50c58bec04d-kube-api-access-pqnq6\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.287122 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.287132 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.287141 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.330519 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data" (OuterVolumeSpecName: "config-data") pod "22c2d787-a0d2-4f05-899d-d50c58bec04d" (UID: "22c2d787-a0d2-4f05-899d-d50c58bec04d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.389123 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c2d787-a0d2-4f05-899d-d50c58bec04d-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.779962 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22c2d787-a0d2-4f05-899d-d50c58bec04d","Type":"ContainerDied","Data":"d21eae36cc95b74bf5a1dac734b960516a9162792b123fbcc45d36851b31fe49"} Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.780570 4753 scope.go:117] "RemoveContainer" containerID="e38367a314fbe2b1f35471d80ead236585fc42d34bcd7e0e66cc76da0dcc59ec" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.780520 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.828367 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.835442 4753 scope.go:117] "RemoveContainer" containerID="919a220bef0dd7a70c1dc5f2dd0b2d0195db3b783b2bf68dbd28cfa2a2e6a66e" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.838110 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.862446 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 15:36:25 crc kubenswrapper[4753]: E0129 15:36:25.863292 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerName="cinder-scheduler" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.863418 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerName="cinder-scheduler" Jan 29 15:36:25 crc kubenswrapper[4753]: E0129 15:36:25.863530 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerName="probe" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.863604 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerName="probe" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.863927 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerName="probe" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.864032 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c2d787-a0d2-4f05-899d-d50c58bec04d" containerName="cinder-scheduler" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.865499 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.870958 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.874750 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.906391 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xvnp\" (UniqueName: \"kubernetes.io/projected/78998281-5df0-4295-80f6-3b9e0cd7fac4-kube-api-access-8xvnp\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.906502 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78998281-5df0-4295-80f6-3b9e0cd7fac4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.906561 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-config-data\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.906603 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-scripts\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.906663 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:25 crc kubenswrapper[4753]: I0129 15:36:25.906761 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.009660 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xvnp\" (UniqueName: \"kubernetes.io/projected/78998281-5df0-4295-80f6-3b9e0cd7fac4-kube-api-access-8xvnp\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.009803 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78998281-5df0-4295-80f6-3b9e0cd7fac4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.009891 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-config-data\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.009892 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78998281-5df0-4295-80f6-3b9e0cd7fac4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.009970 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-scripts\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.010068 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.010330 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.014317 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-scripts\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.014677 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.014966 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.027993 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78998281-5df0-4295-80f6-3b9e0cd7fac4-config-data\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.029648 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xvnp\" (UniqueName: \"kubernetes.io/projected/78998281-5df0-4295-80f6-3b9e0cd7fac4-kube-api-access-8xvnp\") pod \"cinder-scheduler-0\" (UID: \"78998281-5df0-4295-80f6-3b9e0cd7fac4\") " 
pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.163486 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c2d787-a0d2-4f05-899d-d50c58bec04d" path="/var/lib/kubelet/pods/22c2d787-a0d2-4f05-899d-d50c58bec04d/volumes" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.191079 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.644949 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 29 15:36:26 crc kubenswrapper[4753]: I0129 15:36:26.792280 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78998281-5df0-4295-80f6-3b9e0cd7fac4","Type":"ContainerStarted","Data":"732bb37666079aac7b039224070b0da2f861f1df2afc551440763d9af52b2b51"} Jan 29 15:36:27 crc kubenswrapper[4753]: I0129 15:36:27.804703 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78998281-5df0-4295-80f6-3b9e0cd7fac4","Type":"ContainerStarted","Data":"3762ff152625c1f55c75354debdb99110e34518898e6c02956a80538755dd63f"} Jan 29 15:36:28 crc kubenswrapper[4753]: I0129 15:36:28.816175 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78998281-5df0-4295-80f6-3b9e0cd7fac4","Type":"ContainerStarted","Data":"6fa9e3d1c4b61a2922ec15471fb4f5aa6dc9b93d5b38aa7b83fe151270112310"} Jan 29 15:36:28 crc kubenswrapper[4753]: I0129 15:36:28.840610 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.84056857 podStartE2EDuration="3.84056857s" podCreationTimestamp="2026-01-29 15:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:36:28.830932212 +0000 UTC m=+5623.525666614" watchObservedRunningTime="2026-01-29 15:36:28.84056857 +0000 UTC m=+5623.535302972" Jan 29 15:36:29 crc kubenswrapper[4753]: I0129 15:36:29.311661 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 29 15:36:31 crc kubenswrapper[4753]: I0129 15:36:31.192142 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 29 15:36:36 crc kubenswrapper[4753]: I0129 15:36:36.401587 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 29 15:36:57 crc kubenswrapper[4753]: I0129 15:36:57.054830 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:36:57 crc kubenswrapper[4753]: I0129 15:36:57.055422 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:37:27 crc kubenswrapper[4753]: I0129 15:37:27.055547 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:37:27 crc kubenswrapper[4753]: I0129 15:37:27.056211 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:37:57 crc kubenswrapper[4753]: I0129 15:37:57.055031 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:37:57 crc kubenswrapper[4753]: I0129 15:37:57.055589 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:37:57 crc kubenswrapper[4753]: I0129 15:37:57.055638 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 15:37:57 crc kubenswrapper[4753]: I0129 15:37:57.056421 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 15:37:57 crc kubenswrapper[4753]: I0129 15:37:57.056485 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" gracePeriod=600 Jan 29 15:37:57 crc kubenswrapper[4753]: E0129 15:37:57.178528 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:37:57 crc kubenswrapper[4753]: I0129 15:37:57.742798 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" exitCode=0 Jan 29 15:37:57 crc kubenswrapper[4753]: I0129 15:37:57.742853 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"} Jan 29 15:37:57 crc kubenswrapper[4753]: I0129 15:37:57.742895 4753 scope.go:117] "RemoveContainer" 
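containerID="20e283980ad77b065d0dfa0d4018e594dc6a0c2625911542352b6567ce9e5f09"

The machine-config-daemon episode above is the restart-backoff path: repeated liveness failures get the container killed ("failed liveness probe, will be restarted", gracePeriod=600), and each restart attempt is then delayed by CrashLoopBackOff, here already at the ceiling ("back-off 5m0s"). Assuming kubelet's documented defaults (10s initial delay, doubling per crash, capped at 5m), the delay sequence looks like this sketch:

    package main

    import (
        "fmt"
        "time"
    )

    // backoff returns the restart delay after n consecutive crashes,
    // assuming a 10s base doubling up to a 5m ceiling -- which is why
    // the messages above settle at "back-off 5m0s".
    func backoff(n int) time.Duration {
        d := 10 * time.Second
        for i := 0; i < n; i++ {
            d *= 2
            if d >= 5*time.Minute {
                return 5 * time.Minute
            }
        }
        return d
    }

    func main() {
        for n := 0; n <= 6; n++ {
            fmt.Println(n, backoff(n)) // 10s 20s 40s 1m20s 2m40s 5m0s 5m0s
        }
    }
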
containerID="20e283980ad77b065d0dfa0d4018e594dc6a0c2625911542352b6567ce9e5f09" Jan 29 15:37:57 crc kubenswrapper[4753]: I0129 15:37:57.743597 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:37:57 crc kubenswrapper[4753]: E0129 15:37:57.743954 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:38:11 crc kubenswrapper[4753]: I0129 15:38:11.148772 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:38:11 crc kubenswrapper[4753]: E0129 15:38:11.149639 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.661320 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jr6fv"] Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.663135 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.665290 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.668142 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-dnzmv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.679696 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-fljn5"] Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.700566 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.733408 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jr6fv"] Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.760371 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fljn5"] Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.838708 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-scripts\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.839182 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddj2q\" (UniqueName: \"kubernetes.io/projected/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-kube-api-access-ddj2q\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.839279 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e68ccb6b-3714-472e-a754-247bb456104d-var-run-ovn\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.839451 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e68ccb6b-3714-472e-a754-247bb456104d-var-run\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.839542 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-var-lib\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.839623 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e68ccb6b-3714-472e-a754-247bb456104d-scripts\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.839709 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-etc-ovs\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.839797 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e68ccb6b-3714-472e-a754-247bb456104d-var-log-ovn\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.839874 4753 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-var-run\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.839958 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-var-log\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.840288 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v4wt\" (UniqueName: \"kubernetes.io/projected/e68ccb6b-3714-472e-a754-247bb456104d-kube-api-access-4v4wt\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.941904 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v4wt\" (UniqueName: \"kubernetes.io/projected/e68ccb6b-3714-472e-a754-247bb456104d-kube-api-access-4v4wt\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942040 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-scripts\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942073 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddj2q\" (UniqueName: \"kubernetes.io/projected/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-kube-api-access-ddj2q\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942091 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e68ccb6b-3714-472e-a754-247bb456104d-var-run-ovn\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942124 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e68ccb6b-3714-472e-a754-247bb456104d-var-run\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942143 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-var-lib\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942190 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e68ccb6b-3714-472e-a754-247bb456104d-scripts\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942211 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-etc-ovs\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942237 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e68ccb6b-3714-472e-a754-247bb456104d-var-log-ovn\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942255 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-var-run\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942278 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-var-log\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942587 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-var-log\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942644 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-etc-ovs\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942682 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e68ccb6b-3714-472e-a754-247bb456104d-var-log-ovn\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942701 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e68ccb6b-3714-472e-a754-247bb456104d-var-run\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942714 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-var-run\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942748 4753 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-var-lib\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.942836 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e68ccb6b-3714-472e-a754-247bb456104d-var-run-ovn\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.944569 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-scripts\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.944851 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e68ccb6b-3714-472e-a754-247bb456104d-scripts\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.963733 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddj2q\" (UniqueName: \"kubernetes.io/projected/68a6ad2e-376c-4b67-94f1-2c39cd523d5b-kube-api-access-ddj2q\") pod \"ovn-controller-ovs-fljn5\" (UID: \"68a6ad2e-376c-4b67-94f1-2c39cd523d5b\") " pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:16 crc kubenswrapper[4753]: I0129 15:38:16.971498 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v4wt\" (UniqueName: \"kubernetes.io/projected/e68ccb6b-3714-472e-a754-247bb456104d-kube-api-access-4v4wt\") pod \"ovn-controller-jr6fv\" (UID: \"e68ccb6b-3714-472e-a754-247bb456104d\") " pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.000295 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.020445 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.050559 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wqvzr"] Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.068206 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-36bb-account-create-update-sfm8j"] Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.077559 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wqvzr"] Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.087542 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-36bb-account-create-update-sfm8j"] Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.542072 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jr6fv"] Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.860166 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fljn5"] Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.947053 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jr6fv" event={"ID":"e68ccb6b-3714-472e-a754-247bb456104d","Type":"ContainerStarted","Data":"4663f36e82cbafd60d0ac92b837dd37988285dd5988269c8c42877ddc3fbd8fa"} Jan 29 15:38:17 crc kubenswrapper[4753]: I0129 15:38:17.951009 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fljn5" event={"ID":"68a6ad2e-376c-4b67-94f1-2c39cd523d5b","Type":"ContainerStarted","Data":"e85004e2452c30622ee55695010f7a9dd1ad17487a2593441e356221a27a0161"} Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.160832 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08804285-e806-48a0-a925-67672aed97b5" path="/var/lib/kubelet/pods/08804285-e806-48a0-a925-67672aed97b5/volumes" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.165378 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479fd91e-fae0-432b-a820-d08f11fb8229" path="/var/lib/kubelet/pods/479fd91e-fae0-432b-a820-d08f11fb8229/volumes" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.167341 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-jr9zv"] Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.168656 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.172725 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jr9zv"] Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.175684 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.205401 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d7baa621-2d41-4a8a-ad05-87491d3f20ad-ovs-rundir\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.206483 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d7baa621-2d41-4a8a-ad05-87491d3f20ad-ovn-rundir\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.206529 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7baa621-2d41-4a8a-ad05-87491d3f20ad-config\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.206548 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjpj9\" (UniqueName: \"kubernetes.io/projected/d7baa621-2d41-4a8a-ad05-87491d3f20ad-kube-api-access-tjpj9\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.309236 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d7baa621-2d41-4a8a-ad05-87491d3f20ad-ovn-rundir\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.309312 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7baa621-2d41-4a8a-ad05-87491d3f20ad-config\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.309338 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjpj9\" (UniqueName: \"kubernetes.io/projected/d7baa621-2d41-4a8a-ad05-87491d3f20ad-kube-api-access-tjpj9\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.309459 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d7baa621-2d41-4a8a-ad05-87491d3f20ad-ovs-rundir\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " 
pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.309957 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d7baa621-2d41-4a8a-ad05-87491d3f20ad-ovs-rundir\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.310208 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7baa621-2d41-4a8a-ad05-87491d3f20ad-config\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.310278 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d7baa621-2d41-4a8a-ad05-87491d3f20ad-ovn-rundir\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.328724 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjpj9\" (UniqueName: \"kubernetes.io/projected/d7baa621-2d41-4a8a-ad05-87491d3f20ad-kube-api-access-tjpj9\") pod \"ovn-controller-metrics-jr9zv\" (UID: \"d7baa621-2d41-4a8a-ad05-87491d3f20ad\") " pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.517651 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jr9zv" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.526728 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-wmq68"] Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.527977 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.541570 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-wmq68"] Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.614043 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-operator-scripts\") pod \"octavia-db-create-wmq68\" (UID: \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\") " pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.614500 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smgvq\" (UniqueName: \"kubernetes.io/projected/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-kube-api-access-smgvq\") pod \"octavia-db-create-wmq68\" (UID: \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\") " pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.721139 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smgvq\" (UniqueName: \"kubernetes.io/projected/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-kube-api-access-smgvq\") pod \"octavia-db-create-wmq68\" (UID: \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\") " pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.721408 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-operator-scripts\") pod \"octavia-db-create-wmq68\" (UID: \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\") " pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.722834 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-operator-scripts\") pod \"octavia-db-create-wmq68\" (UID: \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\") " pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.749269 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smgvq\" (UniqueName: \"kubernetes.io/projected/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-kube-api-access-smgvq\") pod \"octavia-db-create-wmq68\" (UID: \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\") " pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.943283 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.989477 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jr6fv" event={"ID":"e68ccb6b-3714-472e-a754-247bb456104d","Type":"ContainerStarted","Data":"16029e7cb2c2d169d49c0911c35b536d13b76deca19182b7af7e7e8b17809c95"} Jan 29 15:38:18 crc kubenswrapper[4753]: I0129 15:38:18.990601 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:19 crc kubenswrapper[4753]: I0129 15:38:19.004750 4753 generic.go:334] "Generic (PLEG): container finished" podID="68a6ad2e-376c-4b67-94f1-2c39cd523d5b" containerID="55aebc917e98924d350c7d6b5c1e12a063c001bfc6f15074ea9be9481ca9e246" exitCode=0 Jan 29 15:38:19 crc kubenswrapper[4753]: I0129 15:38:19.004810 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fljn5" event={"ID":"68a6ad2e-376c-4b67-94f1-2c39cd523d5b","Type":"ContainerDied","Data":"55aebc917e98924d350c7d6b5c1e12a063c001bfc6f15074ea9be9481ca9e246"} Jan 29 15:38:19 crc kubenswrapper[4753]: I0129 15:38:19.014386 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-jr6fv" podStartSLOduration=3.014364272 podStartE2EDuration="3.014364272s" podCreationTimestamp="2026-01-29 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:38:19.008216648 +0000 UTC m=+5733.702951040" watchObservedRunningTime="2026-01-29 15:38:19.014364272 +0000 UTC m=+5733.709098654" Jan 29 15:38:19 crc kubenswrapper[4753]: I0129 15:38:19.041507 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jr9zv"] Jan 29 15:38:19 crc kubenswrapper[4753]: I0129 15:38:19.620692 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-wmq68"] Jan 29 15:38:19 crc kubenswrapper[4753]: W0129 15:38:19.627765 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce7b0f5a_10b7_49e6_9dcc_c959ef4b698f.slice/crio-7a2a42d3373035854ff801a5da371411513410081a6968b2b92d65925933fab6 WatchSource:0}: Error finding container 7a2a42d3373035854ff801a5da371411513410081a6968b2b92d65925933fab6: Status 404 returned error can't find the container with id 7a2a42d3373035854ff801a5da371411513410081a6968b2b92d65925933fab6 Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.017879 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fljn5" event={"ID":"68a6ad2e-376c-4b67-94f1-2c39cd523d5b","Type":"ContainerStarted","Data":"3da055322e4d543462a4a60ae8c4fadb8e5a9266bb1133a54fdcaa4894d1eb49"} Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.018366 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fljn5" event={"ID":"68a6ad2e-376c-4b67-94f1-2c39cd523d5b","Type":"ContainerStarted","Data":"0844269b6d58c5949ed152714d676de30bdeb658ffac7f62d19c6a74741d0cdb"} Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.019635 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.019672 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 
15:38:20.021653 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-wmq68" event={"ID":"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f","Type":"ContainerStarted","Data":"bc3407d0a5c92f58438310b7f48aeacedd9b97b3c25829b886e6d324ee6bf3d4"} Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.021686 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-wmq68" event={"ID":"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f","Type":"ContainerStarted","Data":"7a2a42d3373035854ff801a5da371411513410081a6968b2b92d65925933fab6"} Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.030652 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jr9zv" event={"ID":"d7baa621-2d41-4a8a-ad05-87491d3f20ad","Type":"ContainerStarted","Data":"cbaa341094b835a7e75a0ff1150a036dab48d27c60d402432248ffda500edf57"} Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.030711 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jr9zv" event={"ID":"d7baa621-2d41-4a8a-ad05-87491d3f20ad","Type":"ContainerStarted","Data":"b8be8d02b224fd4a73eb852fab38b96b0bbbca447458cf815f1c43069ea2ed34"} Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.054699 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-fljn5" podStartSLOduration=4.054678605 podStartE2EDuration="4.054678605s" podCreationTimestamp="2026-01-29 15:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:38:20.051437788 +0000 UTC m=+5734.746172170" watchObservedRunningTime="2026-01-29 15:38:20.054678605 +0000 UTC m=+5734.749412987" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.077731 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-jr9zv" podStartSLOduration=2.077703742 podStartE2EDuration="2.077703742s" podCreationTimestamp="2026-01-29 15:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:38:20.06570124 +0000 UTC m=+5734.760435622" watchObservedRunningTime="2026-01-29 15:38:20.077703742 +0000 UTC m=+5734.772438124" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.112346 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-create-wmq68" podStartSLOduration=2.112321878 podStartE2EDuration="2.112321878s" podCreationTimestamp="2026-01-29 15:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:38:20.08850132 +0000 UTC m=+5734.783235712" watchObservedRunningTime="2026-01-29 15:38:20.112321878 +0000 UTC m=+5734.807056260" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.143312 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-7d0c-account-create-update-qfdv9"] Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.144810 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.153736 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.178864 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-7d0c-account-create-update-qfdv9"] Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.274773 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8606fff5-4c4f-4f5f-9471-88e3b77284da-operator-scripts\") pod \"octavia-7d0c-account-create-update-qfdv9\" (UID: \"8606fff5-4c4f-4f5f-9471-88e3b77284da\") " pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.274902 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9x8c\" (UniqueName: \"kubernetes.io/projected/8606fff5-4c4f-4f5f-9471-88e3b77284da-kube-api-access-w9x8c\") pod \"octavia-7d0c-account-create-update-qfdv9\" (UID: \"8606fff5-4c4f-4f5f-9471-88e3b77284da\") " pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.376488 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8606fff5-4c4f-4f5f-9471-88e3b77284da-operator-scripts\") pod \"octavia-7d0c-account-create-update-qfdv9\" (UID: \"8606fff5-4c4f-4f5f-9471-88e3b77284da\") " pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.376561 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9x8c\" (UniqueName: \"kubernetes.io/projected/8606fff5-4c4f-4f5f-9471-88e3b77284da-kube-api-access-w9x8c\") pod \"octavia-7d0c-account-create-update-qfdv9\" (UID: \"8606fff5-4c4f-4f5f-9471-88e3b77284da\") " pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.377481 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8606fff5-4c4f-4f5f-9471-88e3b77284da-operator-scripts\") pod \"octavia-7d0c-account-create-update-qfdv9\" (UID: \"8606fff5-4c4f-4f5f-9471-88e3b77284da\") " pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.397806 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9x8c\" (UniqueName: \"kubernetes.io/projected/8606fff5-4c4f-4f5f-9471-88e3b77284da-kube-api-access-w9x8c\") pod \"octavia-7d0c-account-create-update-qfdv9\" (UID: \"8606fff5-4c4f-4f5f-9471-88e3b77284da\") " pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.476196 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:20 crc kubenswrapper[4753]: I0129 15:38:20.980508 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-7d0c-account-create-update-qfdv9"] Jan 29 15:38:21 crc kubenswrapper[4753]: I0129 15:38:21.041560 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-7d0c-account-create-update-qfdv9" event={"ID":"8606fff5-4c4f-4f5f-9471-88e3b77284da","Type":"ContainerStarted","Data":"852aed17cac39651e0174e40126257791d1f50964a9477b870140ee60eaacb97"} Jan 29 15:38:21 crc kubenswrapper[4753]: I0129 15:38:21.043639 4753 generic.go:334] "Generic (PLEG): container finished" podID="ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f" containerID="bc3407d0a5c92f58438310b7f48aeacedd9b97b3c25829b886e6d324ee6bf3d4" exitCode=0 Jan 29 15:38:21 crc kubenswrapper[4753]: I0129 15:38:21.043769 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-wmq68" event={"ID":"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f","Type":"ContainerDied","Data":"bc3407d0a5c92f58438310b7f48aeacedd9b97b3c25829b886e6d324ee6bf3d4"} Jan 29 15:38:22 crc kubenswrapper[4753]: I0129 15:38:22.080332 4753 generic.go:334] "Generic (PLEG): container finished" podID="8606fff5-4c4f-4f5f-9471-88e3b77284da" containerID="07300aa973e7cedf0320cd4b96ddc63261f22a588c275ed9e0d41101076bfdc4" exitCode=0 Jan 29 15:38:22 crc kubenswrapper[4753]: I0129 15:38:22.080472 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-7d0c-account-create-update-qfdv9" event={"ID":"8606fff5-4c4f-4f5f-9471-88e3b77284da","Type":"ContainerDied","Data":"07300aa973e7cedf0320cd4b96ddc63261f22a588c275ed9e0d41101076bfdc4"} Jan 29 15:38:22 crc kubenswrapper[4753]: I0129 15:38:22.476237 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:22 crc kubenswrapper[4753]: I0129 15:38:22.618435 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smgvq\" (UniqueName: \"kubernetes.io/projected/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-kube-api-access-smgvq\") pod \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\" (UID: \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\") " Jan 29 15:38:22 crc kubenswrapper[4753]: I0129 15:38:22.618646 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-operator-scripts\") pod \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\" (UID: \"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f\") " Jan 29 15:38:22 crc kubenswrapper[4753]: I0129 15:38:22.619336 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f" (UID: "ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:38:22 crc kubenswrapper[4753]: I0129 15:38:22.626661 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-kube-api-access-smgvq" (OuterVolumeSpecName: "kube-api-access-smgvq") pod "ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f" (UID: "ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f"). InnerVolumeSpecName "kube-api-access-smgvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:38:22 crc kubenswrapper[4753]: I0129 15:38:22.721314 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smgvq\" (UniqueName: \"kubernetes.io/projected/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-kube-api-access-smgvq\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:22 crc kubenswrapper[4753]: I0129 15:38:22.721359 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.036967 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-thdvt"] Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.047311 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-thdvt"] Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.089959 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-wmq68" event={"ID":"ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f","Type":"ContainerDied","Data":"7a2a42d3373035854ff801a5da371411513410081a6968b2b92d65925933fab6"} Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.090009 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a2a42d3373035854ff801a5da371411513410081a6968b2b92d65925933fab6" Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.089975 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-wmq68" Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.573062 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.641679 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9x8c\" (UniqueName: \"kubernetes.io/projected/8606fff5-4c4f-4f5f-9471-88e3b77284da-kube-api-access-w9x8c\") pod \"8606fff5-4c4f-4f5f-9471-88e3b77284da\" (UID: \"8606fff5-4c4f-4f5f-9471-88e3b77284da\") " Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.641742 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8606fff5-4c4f-4f5f-9471-88e3b77284da-operator-scripts\") pod \"8606fff5-4c4f-4f5f-9471-88e3b77284da\" (UID: \"8606fff5-4c4f-4f5f-9471-88e3b77284da\") " Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.642415 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8606fff5-4c4f-4f5f-9471-88e3b77284da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8606fff5-4c4f-4f5f-9471-88e3b77284da" (UID: "8606fff5-4c4f-4f5f-9471-88e3b77284da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.646769 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8606fff5-4c4f-4f5f-9471-88e3b77284da-kube-api-access-w9x8c" (OuterVolumeSpecName: "kube-api-access-w9x8c") pod "8606fff5-4c4f-4f5f-9471-88e3b77284da" (UID: "8606fff5-4c4f-4f5f-9471-88e3b77284da"). InnerVolumeSpecName "kube-api-access-w9x8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.744018 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9x8c\" (UniqueName: \"kubernetes.io/projected/8606fff5-4c4f-4f5f-9471-88e3b77284da-kube-api-access-w9x8c\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:23 crc kubenswrapper[4753]: I0129 15:38:23.744058 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8606fff5-4c4f-4f5f-9471-88e3b77284da-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:24 crc kubenswrapper[4753]: I0129 15:38:24.136885 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-7d0c-account-create-update-qfdv9" event={"ID":"8606fff5-4c4f-4f5f-9471-88e3b77284da","Type":"ContainerDied","Data":"852aed17cac39651e0174e40126257791d1f50964a9477b870140ee60eaacb97"} Jan 29 15:38:24 crc kubenswrapper[4753]: I0129 15:38:24.136923 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="852aed17cac39651e0174e40126257791d1f50964a9477b870140ee60eaacb97" Jan 29 15:38:24 crc kubenswrapper[4753]: I0129 15:38:24.136953 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-7d0c-account-create-update-qfdv9" Jan 29 15:38:24 crc kubenswrapper[4753]: I0129 15:38:24.161020 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d843255e-b0da-492a-92e6-6d42d8ef9848" path="/var/lib/kubelet/pods/d843255e-b0da-492a-92e6-6d42d8ef9848/volumes" Jan 29 15:38:25 crc kubenswrapper[4753]: I0129 15:38:25.149880 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:38:25 crc kubenswrapper[4753]: E0129 15:38:25.150123 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.585712 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-5rwnd"] Jan 29 15:38:26 crc kubenswrapper[4753]: E0129 15:38:26.586433 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8606fff5-4c4f-4f5f-9471-88e3b77284da" containerName="mariadb-account-create-update" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.586450 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8606fff5-4c4f-4f5f-9471-88e3b77284da" containerName="mariadb-account-create-update" Jan 29 15:38:26 crc kubenswrapper[4753]: E0129 15:38:26.586481 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f" containerName="mariadb-database-create" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.586487 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f" containerName="mariadb-database-create" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.586687 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8606fff5-4c4f-4f5f-9471-88e3b77284da" containerName="mariadb-account-create-update" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 
15:38:26.586704 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f" containerName="mariadb-database-create" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.587314 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.593566 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52g6x\" (UniqueName: \"kubernetes.io/projected/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-kube-api-access-52g6x\") pod \"octavia-persistence-db-create-5rwnd\" (UID: \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\") " pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.593642 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-operator-scripts\") pod \"octavia-persistence-db-create-5rwnd\" (UID: \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\") " pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.596380 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-5rwnd"] Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.694623 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52g6x\" (UniqueName: \"kubernetes.io/projected/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-kube-api-access-52g6x\") pod \"octavia-persistence-db-create-5rwnd\" (UID: \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\") " pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.694688 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-operator-scripts\") pod \"octavia-persistence-db-create-5rwnd\" (UID: \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\") " pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.695478 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-operator-scripts\") pod \"octavia-persistence-db-create-5rwnd\" (UID: \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\") " pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.722935 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52g6x\" (UniqueName: \"kubernetes.io/projected/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-kube-api-access-52g6x\") pod \"octavia-persistence-db-create-5rwnd\" (UID: \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\") " pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:26 crc kubenswrapper[4753]: I0129 15:38:26.908882 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.377449 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-5rwnd"] Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.603688 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-41eb-account-create-update-mq7gf"] Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.605275 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.614090 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.628914 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kpzh\" (UniqueName: \"kubernetes.io/projected/156f1d4c-b1f1-4c66-9425-65a6819d5efb-kube-api-access-6kpzh\") pod \"octavia-41eb-account-create-update-mq7gf\" (UID: \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\") " pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.628984 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/156f1d4c-b1f1-4c66-9425-65a6819d5efb-operator-scripts\") pod \"octavia-41eb-account-create-update-mq7gf\" (UID: \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\") " pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.635245 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-41eb-account-create-update-mq7gf"] Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.730529 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kpzh\" (UniqueName: \"kubernetes.io/projected/156f1d4c-b1f1-4c66-9425-65a6819d5efb-kube-api-access-6kpzh\") pod \"octavia-41eb-account-create-update-mq7gf\" (UID: \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\") " pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.730592 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/156f1d4c-b1f1-4c66-9425-65a6819d5efb-operator-scripts\") pod \"octavia-41eb-account-create-update-mq7gf\" (UID: \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\") " pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.731372 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/156f1d4c-b1f1-4c66-9425-65a6819d5efb-operator-scripts\") pod \"octavia-41eb-account-create-update-mq7gf\" (UID: \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\") " pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:27 crc kubenswrapper[4753]: I0129 15:38:27.764831 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kpzh\" (UniqueName: \"kubernetes.io/projected/156f1d4c-b1f1-4c66-9425-65a6819d5efb-kube-api-access-6kpzh\") pod \"octavia-41eb-account-create-update-mq7gf\" (UID: \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\") " pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:27 crc 
kubenswrapper[4753]: I0129 15:38:27.984058 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:28 crc kubenswrapper[4753]: I0129 15:38:28.183657 4753 generic.go:334] "Generic (PLEG): container finished" podID="f9de6cc8-8d76-4818-9ac7-5242037ba4dd" containerID="11ecf0c88f759c8d15f6d7bfd2818d985fd245da02f5fa9dcefaaace099b83a6" exitCode=0 Jan 29 15:38:28 crc kubenswrapper[4753]: I0129 15:38:28.183724 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-5rwnd" event={"ID":"f9de6cc8-8d76-4818-9ac7-5242037ba4dd","Type":"ContainerDied","Data":"11ecf0c88f759c8d15f6d7bfd2818d985fd245da02f5fa9dcefaaace099b83a6"} Jan 29 15:38:28 crc kubenswrapper[4753]: I0129 15:38:28.183996 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-5rwnd" event={"ID":"f9de6cc8-8d76-4818-9ac7-5242037ba4dd","Type":"ContainerStarted","Data":"a404c8ed21419b1c7485f2ce94066d739a64585668bee16525ac4af2d01aa91e"} Jan 29 15:38:28 crc kubenswrapper[4753]: I0129 15:38:28.423314 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-41eb-account-create-update-mq7gf"] Jan 29 15:38:28 crc kubenswrapper[4753]: W0129 15:38:28.424751 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod156f1d4c_b1f1_4c66_9425_65a6819d5efb.slice/crio-81bd9818f8a5e11865af09c3c20810c4c3879de34301aa93256f6f00eae99ff7 WatchSource:0}: Error finding container 81bd9818f8a5e11865af09c3c20810c4c3879de34301aa93256f6f00eae99ff7: Status 404 returned error can't find the container with id 81bd9818f8a5e11865af09c3c20810c4c3879de34301aa93256f6f00eae99ff7 Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.198242 4753 generic.go:334] "Generic (PLEG): container finished" podID="156f1d4c-b1f1-4c66-9425-65a6819d5efb" containerID="d20fca9eb1965c2ee9d34c6086a3515cca60ee05516e5cdd03b22850d563cfed" exitCode=0 Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.198361 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-41eb-account-create-update-mq7gf" event={"ID":"156f1d4c-b1f1-4c66-9425-65a6819d5efb","Type":"ContainerDied","Data":"d20fca9eb1965c2ee9d34c6086a3515cca60ee05516e5cdd03b22850d563cfed"} Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.198681 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-41eb-account-create-update-mq7gf" event={"ID":"156f1d4c-b1f1-4c66-9425-65a6819d5efb","Type":"ContainerStarted","Data":"81bd9818f8a5e11865af09c3c20810c4c3879de34301aa93256f6f00eae99ff7"} Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.578895 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.669956 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52g6x\" (UniqueName: \"kubernetes.io/projected/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-kube-api-access-52g6x\") pod \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\" (UID: \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\") " Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.670165 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-operator-scripts\") pod \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\" (UID: \"f9de6cc8-8d76-4818-9ac7-5242037ba4dd\") " Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.670824 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9de6cc8-8d76-4818-9ac7-5242037ba4dd" (UID: "f9de6cc8-8d76-4818-9ac7-5242037ba4dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.675663 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-kube-api-access-52g6x" (OuterVolumeSpecName: "kube-api-access-52g6x") pod "f9de6cc8-8d76-4818-9ac7-5242037ba4dd" (UID: "f9de6cc8-8d76-4818-9ac7-5242037ba4dd"). InnerVolumeSpecName "kube-api-access-52g6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.772039 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52g6x\" (UniqueName: \"kubernetes.io/projected/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-kube-api-access-52g6x\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:29 crc kubenswrapper[4753]: I0129 15:38:29.772072 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9de6cc8-8d76-4818-9ac7-5242037ba4dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.208569 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-5rwnd" Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.209300 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-5rwnd" event={"ID":"f9de6cc8-8d76-4818-9ac7-5242037ba4dd","Type":"ContainerDied","Data":"a404c8ed21419b1c7485f2ce94066d739a64585668bee16525ac4af2d01aa91e"} Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.209324 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a404c8ed21419b1c7485f2ce94066d739a64585668bee16525ac4af2d01aa91e" Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.549183 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.596623 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/156f1d4c-b1f1-4c66-9425-65a6819d5efb-operator-scripts\") pod \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\" (UID: \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\") " Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.596727 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kpzh\" (UniqueName: \"kubernetes.io/projected/156f1d4c-b1f1-4c66-9425-65a6819d5efb-kube-api-access-6kpzh\") pod \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\" (UID: \"156f1d4c-b1f1-4c66-9425-65a6819d5efb\") " Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.597219 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/156f1d4c-b1f1-4c66-9425-65a6819d5efb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "156f1d4c-b1f1-4c66-9425-65a6819d5efb" (UID: "156f1d4c-b1f1-4c66-9425-65a6819d5efb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.597706 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/156f1d4c-b1f1-4c66-9425-65a6819d5efb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.602366 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156f1d4c-b1f1-4c66-9425-65a6819d5efb-kube-api-access-6kpzh" (OuterVolumeSpecName: "kube-api-access-6kpzh") pod "156f1d4c-b1f1-4c66-9425-65a6819d5efb" (UID: "156f1d4c-b1f1-4c66-9425-65a6819d5efb"). InnerVolumeSpecName "kube-api-access-6kpzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:38:30 crc kubenswrapper[4753]: I0129 15:38:30.700941 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kpzh\" (UniqueName: \"kubernetes.io/projected/156f1d4c-b1f1-4c66-9425-65a6819d5efb-kube-api-access-6kpzh\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:31 crc kubenswrapper[4753]: I0129 15:38:31.220243 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-41eb-account-create-update-mq7gf" event={"ID":"156f1d4c-b1f1-4c66-9425-65a6819d5efb","Type":"ContainerDied","Data":"81bd9818f8a5e11865af09c3c20810c4c3879de34301aa93256f6f00eae99ff7"} Jan 29 15:38:31 crc kubenswrapper[4753]: I0129 15:38:31.220288 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81bd9818f8a5e11865af09c3c20810c4c3879de34301aa93256f6f00eae99ff7" Jan 29 15:38:31 crc kubenswrapper[4753]: I0129 15:38:31.220298 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-41eb-account-create-update-mq7gf" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.721431 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-6db4d9d88d-tngdc"] Jan 29 15:38:33 crc kubenswrapper[4753]: E0129 15:38:33.723032 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9de6cc8-8d76-4818-9ac7-5242037ba4dd" containerName="mariadb-database-create" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.723070 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9de6cc8-8d76-4818-9ac7-5242037ba4dd" containerName="mariadb-database-create" Jan 29 15:38:33 crc kubenswrapper[4753]: E0129 15:38:33.723123 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156f1d4c-b1f1-4c66-9425-65a6819d5efb" containerName="mariadb-account-create-update" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.723131 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="156f1d4c-b1f1-4c66-9425-65a6819d5efb" containerName="mariadb-account-create-update" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.723371 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9de6cc8-8d76-4818-9ac7-5242037ba4dd" containerName="mariadb-database-create" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.723393 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="156f1d4c-b1f1-4c66-9425-65a6819d5efb" containerName="mariadb-account-create-update" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.725141 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.727230 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-khpts" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.727350 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.728249 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.736973 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6db4d9d88d-tngdc"] Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.763537 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-octavia-run\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.763648 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-config-data\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.763804 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-scripts\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " 
pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.763874 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-config-data-merged\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.763989 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-combined-ca-bundle\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.866174 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-scripts\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.866242 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-config-data-merged\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.866311 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-combined-ca-bundle\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.866409 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-octavia-run\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.866462 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-config-data\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.866864 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-config-data-merged\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.866988 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-octavia-run\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " 
pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.872976 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-combined-ca-bundle\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.873611 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-config-data\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:33 crc kubenswrapper[4753]: I0129 15:38:33.875768 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f2e6df8-8fdf-43d7-bf49-b97301d2767a-scripts\") pod \"octavia-api-6db4d9d88d-tngdc\" (UID: \"8f2e6df8-8fdf-43d7-bf49-b97301d2767a\") " pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:34 crc kubenswrapper[4753]: I0129 15:38:34.050142 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:34 crc kubenswrapper[4753]: I0129 15:38:34.551792 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6db4d9d88d-tngdc"] Jan 29 15:38:35 crc kubenswrapper[4753]: I0129 15:38:35.262084 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6db4d9d88d-tngdc" event={"ID":"8f2e6df8-8fdf-43d7-bf49-b97301d2767a","Type":"ContainerStarted","Data":"3ef31d72ac6bc3950e98b750eb1c88da3becde248a25801bdafe63140bae7e5e"} Jan 29 15:38:36 crc kubenswrapper[4753]: I0129 15:38:36.160708 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:38:36 crc kubenswrapper[4753]: E0129 15:38:36.160987 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:38:37 crc kubenswrapper[4753]: I0129 15:38:37.032817 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5482c"] Jan 29 15:38:37 crc kubenswrapper[4753]: I0129 15:38:37.045035 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5482c"] Jan 29 15:38:38 crc kubenswrapper[4753]: I0129 15:38:38.164405 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b556f5d-9b16-408d-88fb-6f9490dde420" path="/var/lib/kubelet/pods/1b556f5d-9b16-408d-88fb-6f9490dde420/volumes" Jan 29 15:38:44 crc kubenswrapper[4753]: I0129 15:38:44.344430 4753 generic.go:334] "Generic (PLEG): container finished" podID="8f2e6df8-8fdf-43d7-bf49-b97301d2767a" containerID="10879d93751c5604d4a2e3d5b27844a1f129817123cf15cd37092ebc5ab918d2" exitCode=0 Jan 29 15:38:44 crc kubenswrapper[4753]: I0129 15:38:44.344486 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6db4d9d88d-tngdc" 
event={"ID":"8f2e6df8-8fdf-43d7-bf49-b97301d2767a","Type":"ContainerDied","Data":"10879d93751c5604d4a2e3d5b27844a1f129817123cf15cd37092ebc5ab918d2"} Jan 29 15:38:45 crc kubenswrapper[4753]: I0129 15:38:45.354877 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6db4d9d88d-tngdc" event={"ID":"8f2e6df8-8fdf-43d7-bf49-b97301d2767a","Type":"ContainerStarted","Data":"e42e072182f1aa2662f48fb2ff1a3484e50d92114763a9732f6d82efadec62d1"} Jan 29 15:38:45 crc kubenswrapper[4753]: I0129 15:38:45.355430 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6db4d9d88d-tngdc" event={"ID":"8f2e6df8-8fdf-43d7-bf49-b97301d2767a","Type":"ContainerStarted","Data":"3292b91e3e3987b3b9e51a971e1351bd4b5564e556362ae2f9ea1be32f12d587"} Jan 29 15:38:45 crc kubenswrapper[4753]: I0129 15:38:45.356499 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:45 crc kubenswrapper[4753]: I0129 15:38:45.356528 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:45 crc kubenswrapper[4753]: I0129 15:38:45.384993 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-6db4d9d88d-tngdc" podStartSLOduration=3.40238425 podStartE2EDuration="12.384970029s" podCreationTimestamp="2026-01-29 15:38:33 +0000 UTC" firstStartedPulling="2026-01-29 15:38:34.555665543 +0000 UTC m=+5749.250399965" lastFinishedPulling="2026-01-29 15:38:43.538251372 +0000 UTC m=+5758.232985744" observedRunningTime="2026-01-29 15:38:45.377092018 +0000 UTC m=+5760.071826410" watchObservedRunningTime="2026-01-29 15:38:45.384970029 +0000 UTC m=+5760.079704401" Jan 29 15:38:51 crc kubenswrapper[4753]: I0129 15:38:51.150131 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:38:51 crc kubenswrapper[4753]: E0129 15:38:51.150938 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.050453 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jr6fv" podUID="e68ccb6b-3714-472e-a754-247bb456104d" containerName="ovn-controller" probeResult="failure" output=< Jan 29 15:38:52 crc kubenswrapper[4753]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 29 15:38:52 crc kubenswrapper[4753]: > Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.071039 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.074351 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fljn5" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.223682 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jr6fv-config-9phfn"] Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.225192 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.228254 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.269559 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jr6fv-config-9phfn"] Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.362835 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-log-ovn\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.362912 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run-ovn\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.362994 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-additional-scripts\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.363062 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-scripts\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.363127 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.363170 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjzmg\" (UniqueName: \"kubernetes.io/projected/8c5a63f3-4599-48a7-8aac-4f0062515a88-kube-api-access-fjzmg\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.464542 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-log-ovn\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.464631 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run-ovn\") 
pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.464693 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-additional-scripts\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.464774 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-scripts\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.464854 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.464892 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjzmg\" (UniqueName: \"kubernetes.io/projected/8c5a63f3-4599-48a7-8aac-4f0062515a88-kube-api-access-fjzmg\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.465558 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-log-ovn\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.465581 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run-ovn\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.466516 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-additional-scripts\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.466604 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.467814 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-scripts\") pod \"ovn-controller-jr6fv-config-9phfn\" 
(UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.492738 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjzmg\" (UniqueName: \"kubernetes.io/projected/8c5a63f3-4599-48a7-8aac-4f0062515a88-kube-api-access-fjzmg\") pod \"ovn-controller-jr6fv-config-9phfn\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:52 crc kubenswrapper[4753]: I0129 15:38:52.557271 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:53 crc kubenswrapper[4753]: I0129 15:38:53.074832 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jr6fv-config-9phfn"] Jan 29 15:38:53 crc kubenswrapper[4753]: I0129 15:38:53.437548 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jr6fv-config-9phfn" event={"ID":"8c5a63f3-4599-48a7-8aac-4f0062515a88","Type":"ContainerStarted","Data":"45f3369822e9d27958bbe7571a9cd1bb7d754295cdeb7ea6d115b8d7b110b73d"} Jan 29 15:38:53 crc kubenswrapper[4753]: I0129 15:38:53.437908 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jr6fv-config-9phfn" event={"ID":"8c5a63f3-4599-48a7-8aac-4f0062515a88","Type":"ContainerStarted","Data":"377852e467e30ecb24dd7e7ac76bf7cda85c52ca0f0f8e643f1e267b079dfee5"} Jan 29 15:38:53 crc kubenswrapper[4753]: I0129 15:38:53.464375 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-jr6fv-config-9phfn" podStartSLOduration=1.464355986 podStartE2EDuration="1.464355986s" podCreationTimestamp="2026-01-29 15:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:38:53.460790429 +0000 UTC m=+5768.155524821" watchObservedRunningTime="2026-01-29 15:38:53.464355986 +0000 UTC m=+5768.159090368" Jan 29 15:38:53 crc kubenswrapper[4753]: I0129 15:38:53.659588 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:53 crc kubenswrapper[4753]: I0129 15:38:53.837636 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6db4d9d88d-tngdc" Jan 29 15:38:55 crc kubenswrapper[4753]: I0129 15:38:55.456033 4753 generic.go:334] "Generic (PLEG): container finished" podID="8c5a63f3-4599-48a7-8aac-4f0062515a88" containerID="45f3369822e9d27958bbe7571a9cd1bb7d754295cdeb7ea6d115b8d7b110b73d" exitCode=0 Jan 29 15:38:55 crc kubenswrapper[4753]: I0129 15:38:55.456125 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jr6fv-config-9phfn" event={"ID":"8c5a63f3-4599-48a7-8aac-4f0062515a88","Type":"ContainerDied","Data":"45f3369822e9d27958bbe7571a9cd1bb7d754295cdeb7ea6d115b8d7b110b73d"} Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.484222 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-fs4wp"] Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.487115 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.493872 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.494144 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.494524 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.510789 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-fs4wp"] Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.565327 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4c395f-96c2-4745-a815-dc86a0a60498-config-data\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.565381 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4c395f-96c2-4745-a815-dc86a0a60498-scripts\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.565560 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4f4c395f-96c2-4745-a815-dc86a0a60498-hm-ports\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.565741 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4f4c395f-96c2-4745-a815-dc86a0a60498-config-data-merged\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.667741 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4c395f-96c2-4745-a815-dc86a0a60498-config-data\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.667810 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4c395f-96c2-4745-a815-dc86a0a60498-scripts\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.667877 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4f4c395f-96c2-4745-a815-dc86a0a60498-hm-ports\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.667954 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/4f4c395f-96c2-4745-a815-dc86a0a60498-config-data-merged\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.668622 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4f4c395f-96c2-4745-a815-dc86a0a60498-config-data-merged\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.669037 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4f4c395f-96c2-4745-a815-dc86a0a60498-hm-ports\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.677953 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4c395f-96c2-4745-a815-dc86a0a60498-scripts\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.680845 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4c395f-96c2-4745-a815-dc86a0a60498-config-data\") pod \"octavia-rsyslog-fs4wp\" (UID: \"4f4c395f-96c2-4745-a815-dc86a0a60498\") " pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.821675 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:38:56 crc kubenswrapper[4753]: I0129 15:38:56.925362 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.076177 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-jr6fv" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082000 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-additional-scripts\") pod \"8c5a63f3-4599-48a7-8aac-4f0062515a88\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082103 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-log-ovn\") pod \"8c5a63f3-4599-48a7-8aac-4f0062515a88\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082141 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run-ovn\") pod \"8c5a63f3-4599-48a7-8aac-4f0062515a88\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082212 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjzmg\" (UniqueName: \"kubernetes.io/projected/8c5a63f3-4599-48a7-8aac-4f0062515a88-kube-api-access-fjzmg\") pod \"8c5a63f3-4599-48a7-8aac-4f0062515a88\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082278 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-scripts\") pod \"8c5a63f3-4599-48a7-8aac-4f0062515a88\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082336 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run\") pod \"8c5a63f3-4599-48a7-8aac-4f0062515a88\" (UID: \"8c5a63f3-4599-48a7-8aac-4f0062515a88\") " Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082383 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8c5a63f3-4599-48a7-8aac-4f0062515a88" (UID: "8c5a63f3-4599-48a7-8aac-4f0062515a88"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082888 4753 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082945 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run" (OuterVolumeSpecName: "var-run") pod "8c5a63f3-4599-48a7-8aac-4f0062515a88" (UID: "8c5a63f3-4599-48a7-8aac-4f0062515a88"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.082975 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8c5a63f3-4599-48a7-8aac-4f0062515a88" (UID: "8c5a63f3-4599-48a7-8aac-4f0062515a88"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.083184 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8c5a63f3-4599-48a7-8aac-4f0062515a88" (UID: "8c5a63f3-4599-48a7-8aac-4f0062515a88"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.083481 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-scripts" (OuterVolumeSpecName: "scripts") pod "8c5a63f3-4599-48a7-8aac-4f0062515a88" (UID: "8c5a63f3-4599-48a7-8aac-4f0062515a88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.108473 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5a63f3-4599-48a7-8aac-4f0062515a88-kube-api-access-fjzmg" (OuterVolumeSpecName: "kube-api-access-fjzmg") pod "8c5a63f3-4599-48a7-8aac-4f0062515a88" (UID: "8c5a63f3-4599-48a7-8aac-4f0062515a88"). InnerVolumeSpecName "kube-api-access-fjzmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.185380 4753 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.185410 4753 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.185419 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjzmg\" (UniqueName: \"kubernetes.io/projected/8c5a63f3-4599-48a7-8aac-4f0062515a88-kube-api-access-fjzmg\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.185428 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5a63f3-4599-48a7-8aac-4f0062515a88-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.185438 4753 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c5a63f3-4599-48a7-8aac-4f0062515a88-var-run\") on node \"crc\" DevicePath \"\"" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.241399 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-flkkm"] Jan 29 15:38:57 crc kubenswrapper[4753]: E0129 15:38:57.241884 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5a63f3-4599-48a7-8aac-4f0062515a88" containerName="ovn-config" Jan 29 15:38:57 crc 
kubenswrapper[4753]: I0129 15:38:57.241906 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5a63f3-4599-48a7-8aac-4f0062515a88" containerName="ovn-config" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.242179 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5a63f3-4599-48a7-8aac-4f0062515a88" containerName="ovn-config" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.246102 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-flkkm" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.250746 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.279431 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-flkkm"] Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.389892 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07c66a66-0505-46c5-ab26-584d30841864-httpd-config\") pod \"octavia-image-upload-65dd99cb46-flkkm\" (UID: \"07c66a66-0505-46c5-ab26-584d30841864\") " pod="openstack/octavia-image-upload-65dd99cb46-flkkm" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.390030 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/07c66a66-0505-46c5-ab26-584d30841864-amphora-image\") pod \"octavia-image-upload-65dd99cb46-flkkm\" (UID: \"07c66a66-0505-46c5-ab26-584d30841864\") " pod="openstack/octavia-image-upload-65dd99cb46-flkkm" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.483825 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-fs4wp" event={"ID":"4f4c395f-96c2-4745-a815-dc86a0a60498","Type":"ContainerStarted","Data":"6c53117d6628511f55a032bc28e901c19cd0bb5329ef2eb28fad00922606eba1"} Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.487768 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jr6fv-config-9phfn" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.488100 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jr6fv-config-9phfn" event={"ID":"8c5a63f3-4599-48a7-8aac-4f0062515a88","Type":"ContainerDied","Data":"377852e467e30ecb24dd7e7ac76bf7cda85c52ca0f0f8e643f1e267b079dfee5"} Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.488276 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="377852e467e30ecb24dd7e7ac76bf7cda85c52ca0f0f8e643f1e267b079dfee5" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.495099 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/07c66a66-0505-46c5-ab26-584d30841864-amphora-image\") pod \"octavia-image-upload-65dd99cb46-flkkm\" (UID: \"07c66a66-0505-46c5-ab26-584d30841864\") " pod="openstack/octavia-image-upload-65dd99cb46-flkkm" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.495262 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07c66a66-0505-46c5-ab26-584d30841864-httpd-config\") pod \"octavia-image-upload-65dd99cb46-flkkm\" (UID: \"07c66a66-0505-46c5-ab26-584d30841864\") " pod="openstack/octavia-image-upload-65dd99cb46-flkkm" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.498893 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/07c66a66-0505-46c5-ab26-584d30841864-amphora-image\") pod \"octavia-image-upload-65dd99cb46-flkkm\" (UID: \"07c66a66-0505-46c5-ab26-584d30841864\") " pod="openstack/octavia-image-upload-65dd99cb46-flkkm" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.502209 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07c66a66-0505-46c5-ab26-584d30841864-httpd-config\") pod \"octavia-image-upload-65dd99cb46-flkkm\" (UID: \"07c66a66-0505-46c5-ab26-584d30841864\") " pod="openstack/octavia-image-upload-65dd99cb46-flkkm" Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.511736 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-fs4wp"] Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.561595 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-fs4wp"] Jan 29 15:38:57 crc kubenswrapper[4753]: I0129 15:38:57.572050 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-flkkm" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.022966 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jr6fv-config-9phfn"] Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.046716 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jr6fv-config-9phfn"] Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.063224 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-jc6gq"] Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.065134 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.066119 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-jc6gq"] Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.072822 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.103736 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-flkkm"] Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.162822 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5a63f3-4599-48a7-8aac-4f0062515a88" path="/var/lib/kubelet/pods/8c5a63f3-4599-48a7-8aac-4f0062515a88/volumes" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.214094 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/782d2b35-9247-42b6-91b9-a935997474e7-config-data-merged\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.214515 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-config-data\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.214674 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-combined-ca-bundle\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.214810 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-scripts\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.317111 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/782d2b35-9247-42b6-91b9-a935997474e7-config-data-merged\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.317612 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-config-data\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.317664 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/782d2b35-9247-42b6-91b9-a935997474e7-config-data-merged\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.317697 
4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-combined-ca-bundle\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.317729 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-scripts\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.323774 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-scripts\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.323815 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-combined-ca-bundle\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.338578 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-config-data\") pod \"octavia-db-sync-jc6gq\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.432787 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:38:58 crc kubenswrapper[4753]: I0129 15:38:58.576775 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-flkkm" event={"ID":"07c66a66-0505-46c5-ab26-584d30841864","Type":"ContainerStarted","Data":"e7b7c6ac79122bdbbbf2f39a8b1a420148693517d3cbfd7c8ca9aad399856f2c"} Jan 29 15:38:59 crc kubenswrapper[4753]: I0129 15:38:59.057945 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-jc6gq"] Jan 29 15:38:59 crc kubenswrapper[4753]: W0129 15:38:59.088488 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod782d2b35_9247_42b6_91b9_a935997474e7.slice/crio-83cab8102b5785c6b3e58be8ab93f7696c3cd13e165a8ee26144f9f7c074717f WatchSource:0}: Error finding container 83cab8102b5785c6b3e58be8ab93f7696c3cd13e165a8ee26144f9f7c074717f: Status 404 returned error can't find the container with id 83cab8102b5785c6b3e58be8ab93f7696c3cd13e165a8ee26144f9f7c074717f Jan 29 15:38:59 crc kubenswrapper[4753]: I0129 15:38:59.590176 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-jc6gq" event={"ID":"782d2b35-9247-42b6-91b9-a935997474e7","Type":"ContainerStarted","Data":"83cab8102b5785c6b3e58be8ab93f7696c3cd13e165a8ee26144f9f7c074717f"} Jan 29 15:39:00 crc kubenswrapper[4753]: I0129 15:39:00.601462 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-jc6gq" event={"ID":"782d2b35-9247-42b6-91b9-a935997474e7","Type":"ContainerStarted","Data":"8d455b5e8c4810211d797d03e1c8774ea06e385b76a85777d4260549aad7323b"} Jan 29 15:39:01 crc kubenswrapper[4753]: I0129 15:39:01.611636 4753 generic.go:334] "Generic (PLEG): container finished" podID="782d2b35-9247-42b6-91b9-a935997474e7" containerID="8d455b5e8c4810211d797d03e1c8774ea06e385b76a85777d4260549aad7323b" exitCode=0 Jan 29 15:39:01 crc kubenswrapper[4753]: I0129 15:39:01.612025 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-jc6gq" event={"ID":"782d2b35-9247-42b6-91b9-a935997474e7","Type":"ContainerDied","Data":"8d455b5e8c4810211d797d03e1c8774ea06e385b76a85777d4260549aad7323b"} Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.163429 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-t2tws"] Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.176293 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-t2tws"] Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.176418 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.179644 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.179791 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.181299 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.310008 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/65774cd6-bd93-4aef-8747-d2436ecb13ec-config-data-merged\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.310096 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-scripts\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.310345 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-config-data\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.310397 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-combined-ca-bundle\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.310430 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/65774cd6-bd93-4aef-8747-d2436ecb13ec-hm-ports\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.310464 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-amphora-certs\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.412580 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-config-data\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.412640 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-combined-ca-bundle\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.412667 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/65774cd6-bd93-4aef-8747-d2436ecb13ec-hm-ports\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.412687 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-amphora-certs\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.412750 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/65774cd6-bd93-4aef-8747-d2436ecb13ec-config-data-merged\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.412787 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-scripts\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.413494 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/65774cd6-bd93-4aef-8747-d2436ecb13ec-config-data-merged\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.414173 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/65774cd6-bd93-4aef-8747-d2436ecb13ec-hm-ports\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.512558 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-config-data\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.513188 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-scripts\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.514447 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-amphora-certs\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " 
pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.519937 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65774cd6-bd93-4aef-8747-d2436ecb13ec-combined-ca-bundle\") pod \"octavia-healthmanager-t2tws\" (UID: \"65774cd6-bd93-4aef-8747-d2436ecb13ec\") " pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:02 crc kubenswrapper[4753]: I0129 15:39:02.802291 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.566494 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-t2tws"] Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.638337 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-jc6gq" event={"ID":"782d2b35-9247-42b6-91b9-a935997474e7","Type":"ContainerStarted","Data":"21313d876224b59e882c7a5a5c997a5c1f413fa7b0bf73bb1300a2090b4343ea"} Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.639581 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-t2tws" event={"ID":"65774cd6-bd93-4aef-8747-d2436ecb13ec","Type":"ContainerStarted","Data":"1d7593bafe611370f6308acca449b8c543423cba13e128977a300a6ee704b185"} Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.641703 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-fs4wp" event={"ID":"4f4c395f-96c2-4745-a815-dc86a0a60498","Type":"ContainerStarted","Data":"5021b2b02211c8cdfe0b888d0995b2067c81262425de91249a9270d5829ed012"} Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.698220 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-jc6gq" podStartSLOduration=5.698203493 podStartE2EDuration="5.698203493s" podCreationTimestamp="2026-01-29 15:38:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:39:03.668952333 +0000 UTC m=+5778.363686735" watchObservedRunningTime="2026-01-29 15:39:03.698203493 +0000 UTC m=+5778.392937875" Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.931971 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-vv6cz"] Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.933759 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.937225 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.937277 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Jan 29 15:39:03 crc kubenswrapper[4753]: I0129 15:39:03.953293 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-vv6cz"] Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.053736 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f72baddf-f7f0-4f14-9d4e-cf49b4790797-hm-ports\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.053816 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-combined-ca-bundle\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.054093 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-scripts\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.054193 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-amphora-certs\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.054282 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-config-data\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.054312 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f72baddf-f7f0-4f14-9d4e-cf49b4790797-config-data-merged\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.157731 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-config-data\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.158340 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/f72baddf-f7f0-4f14-9d4e-cf49b4790797-config-data-merged\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.158419 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f72baddf-f7f0-4f14-9d4e-cf49b4790797-hm-ports\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.158520 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-combined-ca-bundle\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.158713 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-scripts\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.158763 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-amphora-certs\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.158791 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f72baddf-f7f0-4f14-9d4e-cf49b4790797-config-data-merged\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.160459 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f72baddf-f7f0-4f14-9d4e-cf49b4790797-hm-ports\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.166185 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-amphora-certs\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.166226 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-scripts\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.170266 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-config-data\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " 
pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.171678 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72baddf-f7f0-4f14-9d4e-cf49b4790797-combined-ca-bundle\") pod \"octavia-housekeeping-vv6cz\" (UID: \"f72baddf-f7f0-4f14-9d4e-cf49b4790797\") " pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.271450 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.657095 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-t2tws" event={"ID":"65774cd6-bd93-4aef-8747-d2436ecb13ec","Type":"ContainerStarted","Data":"551e2b689c50b9e6b2b4c0a85d5e9f3cfdc80638600e4e45c048516f40bb31e3"} Jan 29 15:39:04 crc kubenswrapper[4753]: W0129 15:39:04.955459 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf72baddf_f7f0_4f14_9d4e_cf49b4790797.slice/crio-bf0103658275795d514e6065bd464fae0437cc88ee4b2a37c0b5e8f04314bb5f WatchSource:0}: Error finding container bf0103658275795d514e6065bd464fae0437cc88ee4b2a37c0b5e8f04314bb5f: Status 404 returned error can't find the container with id bf0103658275795d514e6065bd464fae0437cc88ee4b2a37c0b5e8f04314bb5f Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.958557 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 15:39:04 crc kubenswrapper[4753]: I0129 15:39:04.972469 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-vv6cz"] Jan 29 15:39:05 crc kubenswrapper[4753]: I0129 15:39:05.150078 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:39:05 crc kubenswrapper[4753]: E0129 15:39:05.150403 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:39:05 crc kubenswrapper[4753]: I0129 15:39:05.668372 4753 generic.go:334] "Generic (PLEG): container finished" podID="4f4c395f-96c2-4745-a815-dc86a0a60498" containerID="5021b2b02211c8cdfe0b888d0995b2067c81262425de91249a9270d5829ed012" exitCode=0 Jan 29 15:39:05 crc kubenswrapper[4753]: I0129 15:39:05.668440 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-fs4wp" event={"ID":"4f4c395f-96c2-4745-a815-dc86a0a60498","Type":"ContainerDied","Data":"5021b2b02211c8cdfe0b888d0995b2067c81262425de91249a9270d5829ed012"} Jan 29 15:39:05 crc kubenswrapper[4753]: I0129 15:39:05.670839 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-vv6cz" event={"ID":"f72baddf-f7f0-4f14-9d4e-cf49b4790797","Type":"ContainerStarted","Data":"bf0103658275795d514e6065bd464fae0437cc88ee4b2a37c0b5e8f04314bb5f"} Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.692443 4753 generic.go:334] "Generic (PLEG): container finished" podID="65774cd6-bd93-4aef-8747-d2436ecb13ec" 
containerID="551e2b689c50b9e6b2b4c0a85d5e9f3cfdc80638600e4e45c048516f40bb31e3" exitCode=0 Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.692495 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-t2tws" event={"ID":"65774cd6-bd93-4aef-8747-d2436ecb13ec","Type":"ContainerDied","Data":"551e2b689c50b9e6b2b4c0a85d5e9f3cfdc80638600e4e45c048516f40bb31e3"} Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.797525 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-vdlhc"] Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.799766 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.802902 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.803004 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.814169 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-vdlhc"] Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.935903 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-config-data\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.935975 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6bb0bc99-03cf-48b9-a648-7141b56c18ba-config-data-merged\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.936012 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-amphora-certs\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.936027 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-combined-ca-bundle\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.936066 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-scripts\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:06 crc kubenswrapper[4753]: I0129 15:39:06.936087 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6bb0bc99-03cf-48b9-a648-7141b56c18ba-hm-ports\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 
15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.039079 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-config-data\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.039259 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6bb0bc99-03cf-48b9-a648-7141b56c18ba-config-data-merged\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.039327 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-amphora-certs\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.039354 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-combined-ca-bundle\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.039407 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-scripts\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.039435 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6bb0bc99-03cf-48b9-a648-7141b56c18ba-hm-ports\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.040689 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6bb0bc99-03cf-48b9-a648-7141b56c18ba-config-data-merged\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.041347 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6bb0bc99-03cf-48b9-a648-7141b56c18ba-hm-ports\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.047791 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-config-data\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.050712 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-combined-ca-bundle\") pod 
\"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.050972 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-amphora-certs\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.055739 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bb0bc99-03cf-48b9-a648-7141b56c18ba-scripts\") pod \"octavia-worker-vdlhc\" (UID: \"6bb0bc99-03cf-48b9-a648-7141b56c18ba\") " pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.161555 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.410532 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-t2tws"] Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.702754 4753 generic.go:334] "Generic (PLEG): container finished" podID="782d2b35-9247-42b6-91b9-a935997474e7" containerID="21313d876224b59e882c7a5a5c997a5c1f413fa7b0bf73bb1300a2090b4343ea" exitCode=0 Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.702811 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-jc6gq" event={"ID":"782d2b35-9247-42b6-91b9-a935997474e7","Type":"ContainerDied","Data":"21313d876224b59e882c7a5a5c997a5c1f413fa7b0bf73bb1300a2090b4343ea"} Jan 29 15:39:07 crc kubenswrapper[4753]: I0129 15:39:07.821619 4753 scope.go:117] "RemoveContainer" containerID="60ad808dfa00971620b658170303e06e7ae34ec07319de287c382d6ff5a3d568" Jan 29 15:39:09 crc kubenswrapper[4753]: I0129 15:39:09.814036 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:39:09 crc kubenswrapper[4753]: I0129 15:39:09.902355 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-combined-ca-bundle\") pod \"782d2b35-9247-42b6-91b9-a935997474e7\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " Jan 29 15:39:09 crc kubenswrapper[4753]: I0129 15:39:09.902430 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/782d2b35-9247-42b6-91b9-a935997474e7-config-data-merged\") pod \"782d2b35-9247-42b6-91b9-a935997474e7\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " Jan 29 15:39:09 crc kubenswrapper[4753]: I0129 15:39:09.902571 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-scripts\") pod \"782d2b35-9247-42b6-91b9-a935997474e7\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " Jan 29 15:39:09 crc kubenswrapper[4753]: I0129 15:39:09.902660 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-config-data\") pod \"782d2b35-9247-42b6-91b9-a935997474e7\" (UID: \"782d2b35-9247-42b6-91b9-a935997474e7\") " Jan 29 15:39:09 crc kubenswrapper[4753]: I0129 15:39:09.908295 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-scripts" (OuterVolumeSpecName: "scripts") pod "782d2b35-9247-42b6-91b9-a935997474e7" (UID: "782d2b35-9247-42b6-91b9-a935997474e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:39:09 crc kubenswrapper[4753]: I0129 15:39:09.908358 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-config-data" (OuterVolumeSpecName: "config-data") pod "782d2b35-9247-42b6-91b9-a935997474e7" (UID: "782d2b35-9247-42b6-91b9-a935997474e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:39:09 crc kubenswrapper[4753]: I0129 15:39:09.930993 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "782d2b35-9247-42b6-91b9-a935997474e7" (UID: "782d2b35-9247-42b6-91b9-a935997474e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:39:09 crc kubenswrapper[4753]: I0129 15:39:09.943426 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/782d2b35-9247-42b6-91b9-a935997474e7-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "782d2b35-9247-42b6-91b9-a935997474e7" (UID: "782d2b35-9247-42b6-91b9-a935997474e7"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.005329 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.005378 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.005394 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/782d2b35-9247-42b6-91b9-a935997474e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.005407 4753 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/782d2b35-9247-42b6-91b9-a935997474e7-config-data-merged\") on node \"crc\" DevicePath \"\"" Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.064501 4753 scope.go:117] "RemoveContainer" containerID="5eb433e494a46864bdc3f079b1b838b7681c3b991a7ceb1f29fd14d8678b18c6" Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.734658 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-jc6gq" event={"ID":"782d2b35-9247-42b6-91b9-a935997474e7","Type":"ContainerDied","Data":"83cab8102b5785c6b3e58be8ab93f7696c3cd13e165a8ee26144f9f7c074717f"} Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.735064 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83cab8102b5785c6b3e58be8ab93f7696c3cd13e165a8ee26144f9f7c074717f" Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.735173 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-jc6gq" Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.748239 4753 scope.go:117] "RemoveContainer" containerID="8b88f155823524b0f724f10d95de9fe9d8446ee03582534c88dc48df98219555" Jan 29 15:39:10 crc kubenswrapper[4753]: I0129 15:39:10.994546 4753 scope.go:117] "RemoveContainer" containerID="cc2a6fc09f44ac5217c4ffde4c2928b7956d1a5ed5ca9d830a11b1554079ca1f" Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.292755 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-vdlhc"] Jan 29 15:39:11 crc kubenswrapper[4753]: W0129 15:39:11.315841 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bb0bc99_03cf_48b9_a648_7141b56c18ba.slice/crio-40c1796374d4a51682011e20cd66f3774063e7a5016d88f58c2b82ab5e3c92c1 WatchSource:0}: Error finding container 40c1796374d4a51682011e20cd66f3774063e7a5016d88f58c2b82ab5e3c92c1: Status 404 returned error can't find the container with id 40c1796374d4a51682011e20cd66f3774063e7a5016d88f58c2b82ab5e3c92c1 Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.750246 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-vdlhc" event={"ID":"6bb0bc99-03cf-48b9-a648-7141b56c18ba","Type":"ContainerStarted","Data":"40c1796374d4a51682011e20cd66f3774063e7a5016d88f58c2b82ab5e3c92c1"} Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.757477 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-vv6cz" event={"ID":"f72baddf-f7f0-4f14-9d4e-cf49b4790797","Type":"ContainerStarted","Data":"336f2913e04233bbda6369dd3110208853fac5b1a1d9bb6f5477a5bde9ff511f"} Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.765731 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-t2tws" event={"ID":"65774cd6-bd93-4aef-8747-d2436ecb13ec","Type":"ContainerStarted","Data":"553b321b062a47139c5d686f27796394d5c892c3e714d715d12c5415c626192d"} Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.766552 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.776362 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-fs4wp" event={"ID":"4f4c395f-96c2-4745-a815-dc86a0a60498","Type":"ContainerStarted","Data":"286fa38a0f98def296ed76bd0c552696f3fbf490d157222199eabe6dfcbb0e0a"} Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.776759 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.780900 4753 generic.go:334] "Generic (PLEG): container finished" podID="07c66a66-0505-46c5-ab26-584d30841864" containerID="fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52" exitCode=0 Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.780950 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-flkkm" event={"ID":"07c66a66-0505-46c5-ab26-584d30841864","Type":"ContainerDied","Data":"fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52"} Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.802232 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-t2tws" podStartSLOduration=9.802215141 podStartE2EDuration="9.802215141s" 
podCreationTimestamp="2026-01-29 15:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:39:11.801992546 +0000 UTC m=+5786.496726948" watchObservedRunningTime="2026-01-29 15:39:11.802215141 +0000 UTC m=+5786.496949523" Jan 29 15:39:11 crc kubenswrapper[4753]: I0129 15:39:11.822516 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-fs4wp" podStartSLOduration=2.507125849 podStartE2EDuration="15.822497438s" podCreationTimestamp="2026-01-29 15:38:56 +0000 UTC" firstStartedPulling="2026-01-29 15:38:57.463803219 +0000 UTC m=+5772.158537601" lastFinishedPulling="2026-01-29 15:39:10.779174808 +0000 UTC m=+5785.473909190" observedRunningTime="2026-01-29 15:39:11.820584207 +0000 UTC m=+5786.515318579" watchObservedRunningTime="2026-01-29 15:39:11.822497438 +0000 UTC m=+5786.517231810" Jan 29 15:39:12 crc kubenswrapper[4753]: I0129 15:39:12.793860 4753 generic.go:334] "Generic (PLEG): container finished" podID="f72baddf-f7f0-4f14-9d4e-cf49b4790797" containerID="336f2913e04233bbda6369dd3110208853fac5b1a1d9bb6f5477a5bde9ff511f" exitCode=0 Jan 29 15:39:12 crc kubenswrapper[4753]: I0129 15:39:12.793940 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-vv6cz" event={"ID":"f72baddf-f7f0-4f14-9d4e-cf49b4790797","Type":"ContainerDied","Data":"336f2913e04233bbda6369dd3110208853fac5b1a1d9bb6f5477a5bde9ff511f"} Jan 29 15:39:12 crc kubenswrapper[4753]: I0129 15:39:12.799101 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-flkkm" event={"ID":"07c66a66-0505-46c5-ab26-584d30841864","Type":"ContainerStarted","Data":"2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99"} Jan 29 15:39:12 crc kubenswrapper[4753]: I0129 15:39:12.832239 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-65dd99cb46-flkkm" podStartSLOduration=3.096553354 podStartE2EDuration="15.832219372s" podCreationTimestamp="2026-01-29 15:38:57 +0000 UTC" firstStartedPulling="2026-01-29 15:38:58.104471875 +0000 UTC m=+5772.799206257" lastFinishedPulling="2026-01-29 15:39:10.840137883 +0000 UTC m=+5785.534872275" observedRunningTime="2026-01-29 15:39:12.828736088 +0000 UTC m=+5787.523470480" watchObservedRunningTime="2026-01-29 15:39:12.832219372 +0000 UTC m=+5787.526953744" Jan 29 15:39:14 crc kubenswrapper[4753]: I0129 15:39:14.822985 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-vv6cz" event={"ID":"f72baddf-f7f0-4f14-9d4e-cf49b4790797","Type":"ContainerStarted","Data":"8bd62c66e29a863f6b6c71935ba20f32c6de430ca38c51556be1c83d12328c15"} Jan 29 15:39:14 crc kubenswrapper[4753]: I0129 15:39:14.823742 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:14 crc kubenswrapper[4753]: I0129 15:39:14.864857 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-vv6cz" podStartSLOduration=6.019889581 podStartE2EDuration="11.864828425s" podCreationTimestamp="2026-01-29 15:39:03 +0000 UTC" firstStartedPulling="2026-01-29 15:39:04.958355724 +0000 UTC m=+5779.653090106" lastFinishedPulling="2026-01-29 15:39:10.803294568 +0000 UTC m=+5785.498028950" observedRunningTime="2026-01-29 15:39:14.852361509 +0000 UTC m=+5789.547095901" watchObservedRunningTime="2026-01-29 15:39:14.864828425 
+0000 UTC m=+5789.559562827" Jan 29 15:39:15 crc kubenswrapper[4753]: I0129 15:39:15.840378 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-vdlhc" event={"ID":"6bb0bc99-03cf-48b9-a648-7141b56c18ba","Type":"ContainerStarted","Data":"89938607c13ecec09b068e1086fcb861a7b494602890256898ecf3f69806abb8"} Jan 29 15:39:16 crc kubenswrapper[4753]: I0129 15:39:16.851217 4753 generic.go:334] "Generic (PLEG): container finished" podID="6bb0bc99-03cf-48b9-a648-7141b56c18ba" containerID="89938607c13ecec09b068e1086fcb861a7b494602890256898ecf3f69806abb8" exitCode=0 Jan 29 15:39:16 crc kubenswrapper[4753]: I0129 15:39:16.851273 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-vdlhc" event={"ID":"6bb0bc99-03cf-48b9-a648-7141b56c18ba","Type":"ContainerDied","Data":"89938607c13ecec09b068e1086fcb861a7b494602890256898ecf3f69806abb8"} Jan 29 15:39:17 crc kubenswrapper[4753]: I0129 15:39:17.833475 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-t2tws" Jan 29 15:39:17 crc kubenswrapper[4753]: I0129 15:39:17.863015 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-vdlhc" event={"ID":"6bb0bc99-03cf-48b9-a648-7141b56c18ba","Type":"ContainerStarted","Data":"42e0e48f1aec4e94e38f577f602621966c60ad03aa7cd1ccca98e83b96edbc53"} Jan 29 15:39:17 crc kubenswrapper[4753]: I0129 15:39:17.863290 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:17 crc kubenswrapper[4753]: I0129 15:39:17.888293 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-vdlhc" podStartSLOduration=8.45804697 podStartE2EDuration="11.888270193s" podCreationTimestamp="2026-01-29 15:39:06 +0000 UTC" firstStartedPulling="2026-01-29 15:39:11.319393004 +0000 UTC m=+5786.014127386" lastFinishedPulling="2026-01-29 15:39:14.749616227 +0000 UTC m=+5789.444350609" observedRunningTime="2026-01-29 15:39:17.880131243 +0000 UTC m=+5792.574865645" watchObservedRunningTime="2026-01-29 15:39:17.888270193 +0000 UTC m=+5792.583004595" Jan 29 15:39:19 crc kubenswrapper[4753]: I0129 15:39:19.149797 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:39:19 crc kubenswrapper[4753]: E0129 15:39:19.151539 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:39:19 crc kubenswrapper[4753]: I0129 15:39:19.301414 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-vv6cz" Jan 29 15:39:22 crc kubenswrapper[4753]: I0129 15:39:22.195288 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-vdlhc" Jan 29 15:39:26 crc kubenswrapper[4753]: I0129 15:39:26.852846 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-fs4wp" Jan 29 15:39:33 crc kubenswrapper[4753]: I0129 15:39:33.150344 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 
Jan 29 15:39:33 crc kubenswrapper[4753]: E0129 15:39:33.151476 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:39:40 crc kubenswrapper[4753]: I0129 15:39:40.508079 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-flkkm"]
Jan 29 15:39:40 crc kubenswrapper[4753]: I0129 15:39:40.509839 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-65dd99cb46-flkkm" podUID="07c66a66-0505-46c5-ab26-584d30841864" containerName="octavia-amphora-httpd" containerID="cri-o://2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99" gracePeriod=30
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.050453 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-flkkm"
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.107987 4753 generic.go:334] "Generic (PLEG): container finished" podID="07c66a66-0505-46c5-ab26-584d30841864" containerID="2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99" exitCode=0
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.108175 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-flkkm"
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.108204 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-flkkm" event={"ID":"07c66a66-0505-46c5-ab26-584d30841864","Type":"ContainerDied","Data":"2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99"}
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.108593 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-flkkm" event={"ID":"07c66a66-0505-46c5-ab26-584d30841864","Type":"ContainerDied","Data":"e7b7c6ac79122bdbbbf2f39a8b1a420148693517d3cbfd7c8ca9aad399856f2c"}
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.108619 4753 scope.go:117] "RemoveContainer" containerID="2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99"
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.139452 4753 scope.go:117] "RemoveContainer" containerID="fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52"
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.163942 4753 scope.go:117] "RemoveContainer" containerID="2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99"
Jan 29 15:39:41 crc kubenswrapper[4753]: E0129 15:39:41.164581 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99\": container with ID starting with 2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99 not found: ID does not exist" containerID="2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99"
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.164625 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99"} err="failed to get container status \"2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99\": rpc error: code = NotFound desc = could not find container \"2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99\": container with ID starting with 2f232c1105be024de99794cc6a3a51b40a9650ab35c3f125328be8c6029c9b99 not found: ID does not exist"
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.164655 4753 scope.go:117] "RemoveContainer" containerID="fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52"
Jan 29 15:39:41 crc kubenswrapper[4753]: E0129 15:39:41.164990 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52\": container with ID starting with fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52 not found: ID does not exist" containerID="fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52"
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.165077 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52"} err="failed to get container status \"fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52\": rpc error: code = NotFound desc = could not find container \"fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52\": container with ID starting with fb30a071f8646ec81e5877857fb213818feda122abe79a837f0201bf93179f52 not found: ID does not exist"
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.199350 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07c66a66-0505-46c5-ab26-584d30841864-httpd-config\") pod \"07c66a66-0505-46c5-ab26-584d30841864\" (UID: \"07c66a66-0505-46c5-ab26-584d30841864\") "
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.199642 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/07c66a66-0505-46c5-ab26-584d30841864-amphora-image\") pod \"07c66a66-0505-46c5-ab26-584d30841864\" (UID: \"07c66a66-0505-46c5-ab26-584d30841864\") "
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.235605 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c66a66-0505-46c5-ab26-584d30841864-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "07c66a66-0505-46c5-ab26-584d30841864" (UID: "07c66a66-0505-46c5-ab26-584d30841864"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.268602 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c66a66-0505-46c5-ab26-584d30841864-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "07c66a66-0505-46c5-ab26-584d30841864" (UID: "07c66a66-0505-46c5-ab26-584d30841864"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.301560 4753 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/07c66a66-0505-46c5-ab26-584d30841864-amphora-image\") on node \"crc\" DevicePath \"\""
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.301597 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07c66a66-0505-46c5-ab26-584d30841864-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.448506 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-flkkm"]
Jan 29 15:39:41 crc kubenswrapper[4753]: I0129 15:39:41.458164 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-flkkm"]
Jan 29 15:39:42 crc kubenswrapper[4753]: I0129 15:39:42.162559 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c66a66-0505-46c5-ab26-584d30841864" path="/var/lib/kubelet/pods/07c66a66-0505-46c5-ab26-584d30841864/volumes"
Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.149646 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:39:44 crc kubenswrapper[4753]: E0129 15:39:44.150193 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.505804 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-96gzr"]
Jan 29 15:39:44 crc kubenswrapper[4753]: E0129 15:39:44.506618 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c66a66-0505-46c5-ab26-584d30841864" containerName="init"
Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.506644 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c66a66-0505-46c5-ab26-584d30841864" containerName="init"
Jan 29 15:39:44 crc kubenswrapper[4753]: E0129 15:39:44.506697 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782d2b35-9247-42b6-91b9-a935997474e7" containerName="init"
Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.506706 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="782d2b35-9247-42b6-91b9-a935997474e7" containerName="init"
Jan 29 15:39:44 crc kubenswrapper[4753]: E0129 15:39:44.506724 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782d2b35-9247-42b6-91b9-a935997474e7" containerName="octavia-db-sync"
Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.506731 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="782d2b35-9247-42b6-91b9-a935997474e7" containerName="octavia-db-sync"
Jan 29 15:39:44 crc kubenswrapper[4753]: E0129 15:39:44.506739 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c66a66-0505-46c5-ab26-584d30841864" containerName="octavia-amphora-httpd"
Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.506745 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c66a66-0505-46c5-ab26-584d30841864" containerName="octavia-amphora-httpd"
containerName="octavia-amphora-httpd" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.506948 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c66a66-0505-46c5-ab26-584d30841864" containerName="octavia-amphora-httpd" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.506964 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="782d2b35-9247-42b6-91b9-a935997474e7" containerName="octavia-db-sync" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.508043 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-96gzr" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.512719 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.517175 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-96gzr"] Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.562046 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce9f7a8b-5457-4348-8850-959a150b9a35-httpd-config\") pod \"octavia-image-upload-65dd99cb46-96gzr\" (UID: \"ce9f7a8b-5457-4348-8850-959a150b9a35\") " pod="openstack/octavia-image-upload-65dd99cb46-96gzr" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.562111 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/ce9f7a8b-5457-4348-8850-959a150b9a35-amphora-image\") pod \"octavia-image-upload-65dd99cb46-96gzr\" (UID: \"ce9f7a8b-5457-4348-8850-959a150b9a35\") " pod="openstack/octavia-image-upload-65dd99cb46-96gzr" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.664389 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce9f7a8b-5457-4348-8850-959a150b9a35-httpd-config\") pod \"octavia-image-upload-65dd99cb46-96gzr\" (UID: \"ce9f7a8b-5457-4348-8850-959a150b9a35\") " pod="openstack/octavia-image-upload-65dd99cb46-96gzr" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.664507 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/ce9f7a8b-5457-4348-8850-959a150b9a35-amphora-image\") pod \"octavia-image-upload-65dd99cb46-96gzr\" (UID: \"ce9f7a8b-5457-4348-8850-959a150b9a35\") " pod="openstack/octavia-image-upload-65dd99cb46-96gzr" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.665116 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/ce9f7a8b-5457-4348-8850-959a150b9a35-amphora-image\") pod \"octavia-image-upload-65dd99cb46-96gzr\" (UID: \"ce9f7a8b-5457-4348-8850-959a150b9a35\") " pod="openstack/octavia-image-upload-65dd99cb46-96gzr" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.670578 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce9f7a8b-5457-4348-8850-959a150b9a35-httpd-config\") pod \"octavia-image-upload-65dd99cb46-96gzr\" (UID: \"ce9f7a8b-5457-4348-8850-959a150b9a35\") " pod="openstack/octavia-image-upload-65dd99cb46-96gzr" Jan 29 15:39:44 crc kubenswrapper[4753]: I0129 15:39:44.832251 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-96gzr" Jan 29 15:39:45 crc kubenswrapper[4753]: I0129 15:39:45.271598 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-96gzr"] Jan 29 15:39:46 crc kubenswrapper[4753]: I0129 15:39:46.164635 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-96gzr" event={"ID":"ce9f7a8b-5457-4348-8850-959a150b9a35","Type":"ContainerStarted","Data":"a16402a6a638e189a3340ac978e8491eedb7748608cc855e8f4795f3ea549e09"} Jan 29 15:39:46 crc kubenswrapper[4753]: I0129 15:39:46.164912 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-96gzr" event={"ID":"ce9f7a8b-5457-4348-8850-959a150b9a35","Type":"ContainerStarted","Data":"2dc6aaa325559246f857b396597a13547ca612cb27dfa20dd83b0577125134db"} Jan 29 15:39:47 crc kubenswrapper[4753]: I0129 15:39:47.174462 4753 generic.go:334] "Generic (PLEG): container finished" podID="ce9f7a8b-5457-4348-8850-959a150b9a35" containerID="a16402a6a638e189a3340ac978e8491eedb7748608cc855e8f4795f3ea549e09" exitCode=0 Jan 29 15:39:47 crc kubenswrapper[4753]: I0129 15:39:47.174544 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-96gzr" event={"ID":"ce9f7a8b-5457-4348-8850-959a150b9a35","Type":"ContainerDied","Data":"a16402a6a638e189a3340ac978e8491eedb7748608cc855e8f4795f3ea549e09"} Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.187688 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-96gzr" event={"ID":"ce9f7a8b-5457-4348-8850-959a150b9a35","Type":"ContainerStarted","Data":"cb0a21431a34bcd050facbaf4c7e34df2f69f017df7b37de26c72ea0bfdb9f27"} Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.409292 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-65dd99cb46-96gzr" podStartSLOduration=4.011491876 podStartE2EDuration="4.409270229s" podCreationTimestamp="2026-01-29 15:39:44 +0000 UTC" firstStartedPulling="2026-01-29 15:39:45.298832754 +0000 UTC m=+5819.993567136" lastFinishedPulling="2026-01-29 15:39:45.696611107 +0000 UTC m=+5820.391345489" observedRunningTime="2026-01-29 15:39:48.214910074 +0000 UTC m=+5822.909644456" watchObservedRunningTime="2026-01-29 15:39:48.409270229 +0000 UTC m=+5823.104004611" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.422891 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k7x7v"] Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.425019 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.453516 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7x7v"] Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.552222 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-catalog-content\") pod \"redhat-marketplace-k7x7v\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") " pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.552412 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r78z6\" (UniqueName: \"kubernetes.io/projected/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-kube-api-access-r78z6\") pod \"redhat-marketplace-k7x7v\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") " pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.552676 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-utilities\") pod \"redhat-marketplace-k7x7v\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") " pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.654856 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-utilities\") pod \"redhat-marketplace-k7x7v\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") " pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.655007 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-catalog-content\") pod \"redhat-marketplace-k7x7v\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") " pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.655107 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r78z6\" (UniqueName: \"kubernetes.io/projected/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-kube-api-access-r78z6\") pod \"redhat-marketplace-k7x7v\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") " pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.655505 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-utilities\") pod \"redhat-marketplace-k7x7v\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") " pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.655580 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-catalog-content\") pod \"redhat-marketplace-k7x7v\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") " pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.686701 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-r78z6\" (UniqueName: \"kubernetes.io/projected/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-kube-api-access-r78z6\") pod \"redhat-marketplace-k7x7v\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") " pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:48 crc kubenswrapper[4753]: I0129 15:39:48.758071 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:49 crc kubenswrapper[4753]: I0129 15:39:49.285694 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7x7v"] Jan 29 15:39:50 crc kubenswrapper[4753]: I0129 15:39:50.207293 4753 generic.go:334] "Generic (PLEG): container finished" podID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerID="a227213e7e40d28a123998c5d361e9458e4244e37941fefb27d660587e41a8e4" exitCode=0 Jan 29 15:39:50 crc kubenswrapper[4753]: I0129 15:39:50.207399 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7x7v" event={"ID":"11ca2836-cb62-430b-9ae0-1d7ab1f02bba","Type":"ContainerDied","Data":"a227213e7e40d28a123998c5d361e9458e4244e37941fefb27d660587e41a8e4"} Jan 29 15:39:50 crc kubenswrapper[4753]: I0129 15:39:50.207619 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7x7v" event={"ID":"11ca2836-cb62-430b-9ae0-1d7ab1f02bba","Type":"ContainerStarted","Data":"1893f22852779d71c211d220750126628634a1236eb40594dd967cbde8a5747c"} Jan 29 15:39:51 crc kubenswrapper[4753]: I0129 15:39:51.220081 4753 generic.go:334] "Generic (PLEG): container finished" podID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerID="db60047ab0a67c38bc110f9e59b437638db851315a535143b764973ffdf46bfb" exitCode=0 Jan 29 15:39:51 crc kubenswrapper[4753]: I0129 15:39:51.220590 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7x7v" event={"ID":"11ca2836-cb62-430b-9ae0-1d7ab1f02bba","Type":"ContainerDied","Data":"db60047ab0a67c38bc110f9e59b437638db851315a535143b764973ffdf46bfb"} Jan 29 15:39:52 crc kubenswrapper[4753]: I0129 15:39:52.244714 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7x7v" event={"ID":"11ca2836-cb62-430b-9ae0-1d7ab1f02bba","Type":"ContainerStarted","Data":"eda820a9b10f37efda2e5aab57d5bf482b329b116203645d4a27514a87cf1b5d"} Jan 29 15:39:55 crc kubenswrapper[4753]: I0129 15:39:55.149897 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:39:55 crc kubenswrapper[4753]: E0129 15:39:55.150753 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:39:58 crc kubenswrapper[4753]: I0129 15:39:58.759317 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:58 crc kubenswrapper[4753]: I0129 15:39:58.761923 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k7x7v" Jan 29 15:39:58 crc kubenswrapper[4753]: I0129 
Jan 29 15:39:58 crc kubenswrapper[4753]: I0129 15:39:58.808952 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k7x7v"
Jan 29 15:39:58 crc kubenswrapper[4753]: I0129 15:39:58.838947 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k7x7v" podStartSLOduration=9.234013625 podStartE2EDuration="10.838924617s" podCreationTimestamp="2026-01-29 15:39:48 +0000 UTC" firstStartedPulling="2026-01-29 15:39:50.21017541 +0000 UTC m=+5824.904909792" lastFinishedPulling="2026-01-29 15:39:51.815086392 +0000 UTC m=+5826.509820784" observedRunningTime="2026-01-29 15:39:52.268047144 +0000 UTC m=+5826.962781526" watchObservedRunningTime="2026-01-29 15:39:58.838924617 +0000 UTC m=+5833.533658999"
Jan 29 15:39:59 crc kubenswrapper[4753]: I0129 15:39:59.367729 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k7x7v"
Jan 29 15:39:59 crc kubenswrapper[4753]: I0129 15:39:59.423287 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7x7v"]
Jan 29 15:40:01 crc kubenswrapper[4753]: I0129 15:40:01.337916 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k7x7v" podUID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerName="registry-server" containerID="cri-o://eda820a9b10f37efda2e5aab57d5bf482b329b116203645d4a27514a87cf1b5d" gracePeriod=2
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.385065 4753 generic.go:334] "Generic (PLEG): container finished" podID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerID="eda820a9b10f37efda2e5aab57d5bf482b329b116203645d4a27514a87cf1b5d" exitCode=0
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.385107 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7x7v" event={"ID":"11ca2836-cb62-430b-9ae0-1d7ab1f02bba","Type":"ContainerDied","Data":"eda820a9b10f37efda2e5aab57d5bf482b329b116203645d4a27514a87cf1b5d"}
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.496945 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7x7v"
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.584763 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-catalog-content\") pod \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") "
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.584862 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-utilities\") pod \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") "
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.585084 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r78z6\" (UniqueName: \"kubernetes.io/projected/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-kube-api-access-r78z6\") pod \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\" (UID: \"11ca2836-cb62-430b-9ae0-1d7ab1f02bba\") "
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.587987 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-utilities" (OuterVolumeSpecName: "utilities") pod "11ca2836-cb62-430b-9ae0-1d7ab1f02bba" (UID: "11ca2836-cb62-430b-9ae0-1d7ab1f02bba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.591875 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-kube-api-access-r78z6" (OuterVolumeSpecName: "kube-api-access-r78z6") pod "11ca2836-cb62-430b-9ae0-1d7ab1f02bba" (UID: "11ca2836-cb62-430b-9ae0-1d7ab1f02bba"). InnerVolumeSpecName "kube-api-access-r78z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.608574 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11ca2836-cb62-430b-9ae0-1d7ab1f02bba" (UID: "11ca2836-cb62-430b-9ae0-1d7ab1f02bba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.687321 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r78z6\" (UniqueName: \"kubernetes.io/projected/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-kube-api-access-r78z6\") on node \"crc\" DevicePath \"\""
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.687365 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 15:40:02 crc kubenswrapper[4753]: I0129 15:40:02.687377 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ca2836-cb62-430b-9ae0-1d7ab1f02bba-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 15:40:03 crc kubenswrapper[4753]: I0129 15:40:03.398497 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7x7v" event={"ID":"11ca2836-cb62-430b-9ae0-1d7ab1f02bba","Type":"ContainerDied","Data":"1893f22852779d71c211d220750126628634a1236eb40594dd967cbde8a5747c"}
Jan 29 15:40:03 crc kubenswrapper[4753]: I0129 15:40:03.398882 4753 scope.go:117] "RemoveContainer" containerID="eda820a9b10f37efda2e5aab57d5bf482b329b116203645d4a27514a87cf1b5d"
Jan 29 15:40:03 crc kubenswrapper[4753]: I0129 15:40:03.398740 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7x7v"
Jan 29 15:40:03 crc kubenswrapper[4753]: I0129 15:40:03.422716 4753 scope.go:117] "RemoveContainer" containerID="db60047ab0a67c38bc110f9e59b437638db851315a535143b764973ffdf46bfb"
Jan 29 15:40:03 crc kubenswrapper[4753]: I0129 15:40:03.445274 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7x7v"]
Jan 29 15:40:03 crc kubenswrapper[4753]: I0129 15:40:03.461515 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7x7v"]
Jan 29 15:40:03 crc kubenswrapper[4753]: I0129 15:40:03.463133 4753 scope.go:117] "RemoveContainer" containerID="a227213e7e40d28a123998c5d361e9458e4244e37941fefb27d660587e41a8e4"
Jan 29 15:40:04 crc kubenswrapper[4753]: I0129 15:40:04.163073 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" path="/var/lib/kubelet/pods/11ca2836-cb62-430b-9ae0-1d7ab1f02bba/volumes"
Jan 29 15:40:07 crc kubenswrapper[4753]: I0129 15:40:07.150186 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:40:07 crc kubenswrapper[4753]: E0129 15:40:07.151015 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:40:11 crc kubenswrapper[4753]: I0129 15:40:11.150178 4753 scope.go:117] "RemoveContainer" containerID="6e5c1941db19f53b5bcb21055803069e1176920d9506bad5fab791f8ceb924a7"
Jan 29 15:40:11 crc kubenswrapper[4753]: I0129 15:40:11.178385 4753 scope.go:117] "RemoveContainer" containerID="38e695fc9e9497227ba10d0f2948e74182ef75bb37dbac0b0e8eb47027652a96"
Jan 29 15:40:11 crc kubenswrapper[4753]: I0129 15:40:11.229097 4753 scope.go:117] "RemoveContainer" containerID="72d4c6de4aacbc90c53efa2c08e666e4e45a0ca12bb3952f5bc66ac530767e3e"
Jan 29 15:40:18 crc kubenswrapper[4753]: I0129 15:40:18.150754 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:40:18 crc kubenswrapper[4753]: E0129 15:40:18.151788 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.133799 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c654466df-ckb5g"]
Jan 29 15:40:25 crc kubenswrapper[4753]: E0129 15:40:25.134773 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerName="extract-utilities"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.134792 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerName="extract-utilities"
Jan 29 15:40:25 crc kubenswrapper[4753]: E0129 15:40:25.134804 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerName="registry-server"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.134812 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerName="registry-server"
Jan 29 15:40:25 crc kubenswrapper[4753]: E0129 15:40:25.134832 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerName="extract-content"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.134839 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerName="extract-content"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.135029 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ca2836-cb62-430b-9ae0-1d7ab1f02bba" containerName="registry-server"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.136026 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c654466df-ckb5g"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.139866 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.140113 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.146532 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.146544 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-gvf7n"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.161522 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c654466df-ckb5g"]
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.222712 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.223402 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-log" containerID="cri-o://808b00d324845124c97752b5ba087e9e0abc721ec5465939c7268e00de26fe70" gracePeriod=30
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.223448 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-httpd" containerID="cri-o://d16311801fd926af5c1796ec322bc0e2dc4d10badd0b2df0b6e5e7ad1891d62d" gracePeriod=30
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.245779 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8c64b7b9-zzqxw"]
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.248023 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c64b7b9-zzqxw"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.261739 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c64b7b9-zzqxw"]
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.273267 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f74ce4f9-2bea-4d8e-a301-b33daef607d6-horizon-secret-key\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.273353 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd52p\" (UniqueName: \"kubernetes.io/projected/f74ce4f9-2bea-4d8e-a301-b33daef607d6-kube-api-access-kd52p\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.273420 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-scripts\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.273617 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74ce4f9-2bea-4d8e-a301-b33daef607d6-logs\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.273645 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-config-data\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.317668 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.317923 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-log" containerID="cri-o://4505286665ec8df6d6f490fb8862aaf09f3220c1a84751ff2c473a2b109913b7" gracePeriod=30
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.318383 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-httpd" containerID="cri-o://891128b69da3f3bbbd8efd446640966b20c7879b34a670191412181677ba15c5" gracePeriod=30
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.375355 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f37cc84a-e714-4e00-8885-68ce4992b34e-horizon-secret-key\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw"
Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.375425 4753
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f74ce4f9-2bea-4d8e-a301-b33daef607d6-horizon-secret-key\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.375483 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd52p\" (UniqueName: \"kubernetes.io/projected/f74ce4f9-2bea-4d8e-a301-b33daef607d6-kube-api-access-kd52p\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.375547 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-scripts\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.375978 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74ce4f9-2bea-4d8e-a301-b33daef607d6-logs\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.376003 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-config-data\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.376021 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-config-data\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.376050 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-scripts\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.376135 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4vvf\" (UniqueName: \"kubernetes.io/projected/f37cc84a-e714-4e00-8885-68ce4992b34e-kube-api-access-w4vvf\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.376179 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f37cc84a-e714-4e00-8885-68ce4992b34e-logs\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.376670 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-scripts\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.376776 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74ce4f9-2bea-4d8e-a301-b33daef607d6-logs\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.377348 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-config-data\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.383423 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f74ce4f9-2bea-4d8e-a301-b33daef607d6-horizon-secret-key\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.403006 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd52p\" (UniqueName: \"kubernetes.io/projected/f74ce4f9-2bea-4d8e-a301-b33daef607d6-kube-api-access-kd52p\") pod \"horizon-5c654466df-ckb5g\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") " pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.472074 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c654466df-ckb5g" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.481281 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-config-data\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.481621 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-scripts\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.481841 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4vvf\" (UniqueName: \"kubernetes.io/projected/f37cc84a-e714-4e00-8885-68ce4992b34e-kube-api-access-w4vvf\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.481969 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f37cc84a-e714-4e00-8885-68ce4992b34e-logs\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.482232 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f37cc84a-e714-4e00-8885-68ce4992b34e-horizon-secret-key\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.482572 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-scripts\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.482878 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-config-data\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.483035 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f37cc84a-e714-4e00-8885-68ce4992b34e-logs\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.488869 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f37cc84a-e714-4e00-8885-68ce4992b34e-horizon-secret-key\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.506867 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4vvf\" (UniqueName: 
\"kubernetes.io/projected/f37cc84a-e714-4e00-8885-68ce4992b34e-kube-api-access-w4vvf\") pod \"horizon-8c64b7b9-zzqxw\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") " pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.580864 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c64b7b9-zzqxw" Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.624505 4753 generic.go:334] "Generic (PLEG): container finished" podID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerID="4505286665ec8df6d6f490fb8862aaf09f3220c1a84751ff2c473a2b109913b7" exitCode=143 Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.624683 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4abaca50-6e4d-4947-b3e1-6b627376a788","Type":"ContainerDied","Data":"4505286665ec8df6d6f490fb8862aaf09f3220c1a84751ff2c473a2b109913b7"} Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.627096 4753 generic.go:334] "Generic (PLEG): container finished" podID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerID="808b00d324845124c97752b5ba087e9e0abc721ec5465939c7268e00de26fe70" exitCode=143 Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.627122 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71bc7903-cddb-464d-a5ae-ef660282f4b7","Type":"ContainerDied","Data":"808b00d324845124c97752b5ba087e9e0abc721ec5465939c7268e00de26fe70"} Jan 29 15:40:25 crc kubenswrapper[4753]: I0129 15:40:25.987224 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c654466df-ckb5g"] Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.007985 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c654466df-ckb5g"] Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.025355 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68cd549bc7-5fnnk"] Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.028224 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.040342 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68cd549bc7-5fnnk"] Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.097237 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a1036d-2faa-4780-8ad4-153d6a0ac402-logs\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.097304 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-scripts\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.097363 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-config-data\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.097458 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9gx\" (UniqueName: \"kubernetes.io/projected/92a1036d-2faa-4780-8ad4-153d6a0ac402-kube-api-access-rb9gx\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.097499 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92a1036d-2faa-4780-8ad4-153d6a0ac402-horizon-secret-key\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.105954 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c64b7b9-zzqxw"] Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.201866 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a1036d-2faa-4780-8ad4-153d6a0ac402-logs\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.202062 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-scripts\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.202959 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-config-data\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.203143 4753 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9gx\" (UniqueName: \"kubernetes.io/projected/92a1036d-2faa-4780-8ad4-153d6a0ac402-kube-api-access-rb9gx\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.203209 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92a1036d-2faa-4780-8ad4-153d6a0ac402-horizon-secret-key\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.204348 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-scripts\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.204565 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-config-data\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.207867 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a1036d-2faa-4780-8ad4-153d6a0ac402-logs\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.218636 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92a1036d-2faa-4780-8ad4-153d6a0ac402-horizon-secret-key\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.225658 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb9gx\" (UniqueName: \"kubernetes.io/projected/92a1036d-2faa-4780-8ad4-153d6a0ac402-kube-api-access-rb9gx\") pod \"horizon-68cd549bc7-5fnnk\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.348191 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.638426 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c64b7b9-zzqxw" event={"ID":"f37cc84a-e714-4e00-8885-68ce4992b34e","Type":"ContainerStarted","Data":"d4a8f2e9dc266917f59ab65b350f5d4a103d456dea9846aa83edc56aea46d704"} Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.640925 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c654466df-ckb5g" event={"ID":"f74ce4f9-2bea-4d8e-a301-b33daef607d6","Type":"ContainerStarted","Data":"ed3738305c9e8426b8e12b176e1294e4f01dcd4b5ba171cf5b66f9e91867ff31"} Jan 29 15:40:26 crc kubenswrapper[4753]: I0129 15:40:26.844448 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68cd549bc7-5fnnk"] Jan 29 15:40:26 crc kubenswrapper[4753]: W0129 15:40:26.853656 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92a1036d_2faa_4780_8ad4_153d6a0ac402.slice/crio-b7ebddbe9b65b74ebda3fb617d2df2aeda90a78fb4cb31ff00bdebc314174d65 WatchSource:0}: Error finding container b7ebddbe9b65b74ebda3fb617d2df2aeda90a78fb4cb31ff00bdebc314174d65: Status 404 returned error can't find the container with id b7ebddbe9b65b74ebda3fb617d2df2aeda90a78fb4cb31ff00bdebc314174d65 Jan 29 15:40:27 crc kubenswrapper[4753]: I0129 15:40:27.661650 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd549bc7-5fnnk" event={"ID":"92a1036d-2faa-4780-8ad4-153d6a0ac402","Type":"ContainerStarted","Data":"b7ebddbe9b65b74ebda3fb617d2df2aeda90a78fb4cb31ff00bdebc314174d65"} Jan 29 15:40:28 crc kubenswrapper[4753]: I0129 15:40:28.676540 4753 generic.go:334] "Generic (PLEG): container finished" podID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerID="891128b69da3f3bbbd8efd446640966b20c7879b34a670191412181677ba15c5" exitCode=0 Jan 29 15:40:28 crc kubenswrapper[4753]: I0129 15:40:28.676604 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4abaca50-6e4d-4947-b3e1-6b627376a788","Type":"ContainerDied","Data":"891128b69da3f3bbbd8efd446640966b20c7879b34a670191412181677ba15c5"} Jan 29 15:40:29 crc kubenswrapper[4753]: I0129 15:40:29.689557 4753 generic.go:334] "Generic (PLEG): container finished" podID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerID="d16311801fd926af5c1796ec322bc0e2dc4d10badd0b2df0b6e5e7ad1891d62d" exitCode=0 Jan 29 15:40:29 crc kubenswrapper[4753]: I0129 15:40:29.689611 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71bc7903-cddb-464d-a5ae-ef660282f4b7","Type":"ContainerDied","Data":"d16311801fd926af5c1796ec322bc0e2dc4d10badd0b2df0b6e5e7ad1891d62d"} Jan 29 15:40:29 crc kubenswrapper[4753]: I0129 15:40:29.743937 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.51:9292/healthcheck\": dial tcp 10.217.1.51:9292: connect: connection refused" Jan 29 15:40:29 crc kubenswrapper[4753]: I0129 15:40:29.743937 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.51:9292/healthcheck\": dial tcp 10.217.1.51:9292: 
connect: connection refused" Jan 29 15:40:31 crc kubenswrapper[4753]: I0129 15:40:31.149393 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:40:31 crc kubenswrapper[4753]: E0129 15:40:31.150025 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:40:31 crc kubenswrapper[4753]: I0129 15:40:31.801685 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.52:9292/healthcheck\": dial tcp 10.217.1.52:9292: connect: connection refused" Jan 29 15:40:31 crc kubenswrapper[4753]: I0129 15:40:31.801760 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.52:9292/healthcheck\": dial tcp 10.217.1.52:9292: connect: connection refused" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.177177 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.279862 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6bgc\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-kube-api-access-v6bgc\") pod \"71bc7903-cddb-464d-a5ae-ef660282f4b7\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.279943 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-config-data\") pod \"71bc7903-cddb-464d-a5ae-ef660282f4b7\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.280056 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-scripts\") pod \"71bc7903-cddb-464d-a5ae-ef660282f4b7\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.280105 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-httpd-run\") pod \"71bc7903-cddb-464d-a5ae-ef660282f4b7\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.280226 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-ceph\") pod \"71bc7903-cddb-464d-a5ae-ef660282f4b7\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.280348 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-combined-ca-bundle\") pod \"71bc7903-cddb-464d-a5ae-ef660282f4b7\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.280384 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-logs\") pod \"71bc7903-cddb-464d-a5ae-ef660282f4b7\" (UID: \"71bc7903-cddb-464d-a5ae-ef660282f4b7\") " Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.281921 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "71bc7903-cddb-464d-a5ae-ef660282f4b7" (UID: "71bc7903-cddb-464d-a5ae-ef660282f4b7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.282216 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-logs" (OuterVolumeSpecName: "logs") pod "71bc7903-cddb-464d-a5ae-ef660282f4b7" (UID: "71bc7903-cddb-464d-a5ae-ef660282f4b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.286874 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-scripts" (OuterVolumeSpecName: "scripts") pod "71bc7903-cddb-464d-a5ae-ef660282f4b7" (UID: "71bc7903-cddb-464d-a5ae-ef660282f4b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.298344 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-ceph" (OuterVolumeSpecName: "ceph") pod "71bc7903-cddb-464d-a5ae-ef660282f4b7" (UID: "71bc7903-cddb-464d-a5ae-ef660282f4b7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.299536 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-kube-api-access-v6bgc" (OuterVolumeSpecName: "kube-api-access-v6bgc") pod "71bc7903-cddb-464d-a5ae-ef660282f4b7" (UID: "71bc7903-cddb-464d-a5ae-ef660282f4b7"). InnerVolumeSpecName "kube-api-access-v6bgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.348457 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71bc7903-cddb-464d-a5ae-ef660282f4b7" (UID: "71bc7903-cddb-464d-a5ae-ef660282f4b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.356802 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-config-data" (OuterVolumeSpecName: "config-data") pod "71bc7903-cddb-464d-a5ae-ef660282f4b7" (UID: "71bc7903-cddb-464d-a5ae-ef660282f4b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.384039 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6bgc\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-kube-api-access-v6bgc\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.384073 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.384083 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.384092 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.384101 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/71bc7903-cddb-464d-a5ae-ef660282f4b7-ceph\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.384110 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bc7903-cddb-464d-a5ae-ef660282f4b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.384118 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71bc7903-cddb-464d-a5ae-ef660282f4b7-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.738570 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd549bc7-5fnnk" event={"ID":"92a1036d-2faa-4780-8ad4-153d6a0ac402","Type":"ContainerStarted","Data":"6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b"} Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.740618 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c654466df-ckb5g" event={"ID":"f74ce4f9-2bea-4d8e-a301-b33daef607d6","Type":"ContainerStarted","Data":"984abde2e7b5b64d691afd296a0ed70f89e60c371b19b05f4be7ed13583cc479"} Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.742853 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c64b7b9-zzqxw" event={"ID":"f37cc84a-e714-4e00-8885-68ce4992b34e","Type":"ContainerStarted","Data":"f3a64cba0e7e4ee1596ff9909ca82a1164baf82a9e161703e98e7aba36caee5b"} Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.745019 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"71bc7903-cddb-464d-a5ae-ef660282f4b7","Type":"ContainerDied","Data":"ca5beb51d8041f71c92171d514c8ab7e90f6f768b56c8ff5572babbf1ee1b393"} Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.745069 4753 scope.go:117] "RemoveContainer" containerID="d16311801fd926af5c1796ec322bc0e2dc4d10badd0b2df0b6e5e7ad1891d62d" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.745293 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.826605 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.837592 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.842058 4753 scope.go:117] "RemoveContainer" containerID="808b00d324845124c97752b5ba087e9e0abc721ec5465939c7268e00de26fe70" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.857637 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:40:34 crc kubenswrapper[4753]: E0129 15:40:34.858116 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-log" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.858137 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-log" Jan 29 15:40:34 crc kubenswrapper[4753]: E0129 15:40:34.858180 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-httpd" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.858189 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-httpd" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.858429 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-httpd" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.858455 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" containerName="glance-log" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.859749 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.862481 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 29 15:40:34 crc kubenswrapper[4753]: I0129 15:40:34.896842 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.005076 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/104ef6b1-854f-423e-bfde-2a4d8beedd8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.005556 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89l77\" (UniqueName: \"kubernetes.io/projected/104ef6b1-854f-423e-bfde-2a4d8beedd8f-kube-api-access-89l77\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.005626 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/104ef6b1-854f-423e-bfde-2a4d8beedd8f-ceph\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.005645 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/104ef6b1-854f-423e-bfde-2a4d8beedd8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.005698 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104ef6b1-854f-423e-bfde-2a4d8beedd8f-logs\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.005955 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/104ef6b1-854f-423e-bfde-2a4d8beedd8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.006020 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104ef6b1-854f-423e-bfde-2a4d8beedd8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.070259 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.107618 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89l77\" (UniqueName: \"kubernetes.io/projected/104ef6b1-854f-423e-bfde-2a4d8beedd8f-kube-api-access-89l77\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.107755 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/104ef6b1-854f-423e-bfde-2a4d8beedd8f-ceph\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.107785 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/104ef6b1-854f-423e-bfde-2a4d8beedd8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.107823 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104ef6b1-854f-423e-bfde-2a4d8beedd8f-logs\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.107906 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/104ef6b1-854f-423e-bfde-2a4d8beedd8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.107937 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104ef6b1-854f-423e-bfde-2a4d8beedd8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.108024 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/104ef6b1-854f-423e-bfde-2a4d8beedd8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.108768 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/104ef6b1-854f-423e-bfde-2a4d8beedd8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.110069 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104ef6b1-854f-423e-bfde-2a4d8beedd8f-logs\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.115724 4753 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/104ef6b1-854f-423e-bfde-2a4d8beedd8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.118290 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/104ef6b1-854f-423e-bfde-2a4d8beedd8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.118843 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/104ef6b1-854f-423e-bfde-2a4d8beedd8f-ceph\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.120114 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104ef6b1-854f-423e-bfde-2a4d8beedd8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.142944 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89l77\" (UniqueName: \"kubernetes.io/projected/104ef6b1-854f-423e-bfde-2a4d8beedd8f-kube-api-access-89l77\") pod \"glance-default-external-api-0\" (UID: \"104ef6b1-854f-423e-bfde-2a4d8beedd8f\") " pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.197535 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.209596 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-ceph\") pod \"4abaca50-6e4d-4947-b3e1-6b627376a788\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.209655 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-httpd-run\") pod \"4abaca50-6e4d-4947-b3e1-6b627376a788\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.209774 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-config-data\") pod \"4abaca50-6e4d-4947-b3e1-6b627376a788\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.209805 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-logs\") pod \"4abaca50-6e4d-4947-b3e1-6b627376a788\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.209892 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-scripts\") pod \"4abaca50-6e4d-4947-b3e1-6b627376a788\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.209915 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-combined-ca-bundle\") pod \"4abaca50-6e4d-4947-b3e1-6b627376a788\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.209942 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8svll\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-kube-api-access-8svll\") pod \"4abaca50-6e4d-4947-b3e1-6b627376a788\" (UID: \"4abaca50-6e4d-4947-b3e1-6b627376a788\") " Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.211477 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-logs" (OuterVolumeSpecName: "logs") pod "4abaca50-6e4d-4947-b3e1-6b627376a788" (UID: "4abaca50-6e4d-4947-b3e1-6b627376a788"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.215127 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-ceph" (OuterVolumeSpecName: "ceph") pod "4abaca50-6e4d-4947-b3e1-6b627376a788" (UID: "4abaca50-6e4d-4947-b3e1-6b627376a788"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.216230 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4abaca50-6e4d-4947-b3e1-6b627376a788" (UID: "4abaca50-6e4d-4947-b3e1-6b627376a788"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.216466 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-scripts" (OuterVolumeSpecName: "scripts") pod "4abaca50-6e4d-4947-b3e1-6b627376a788" (UID: "4abaca50-6e4d-4947-b3e1-6b627376a788"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.220731 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-kube-api-access-8svll" (OuterVolumeSpecName: "kube-api-access-8svll") pod "4abaca50-6e4d-4947-b3e1-6b627376a788" (UID: "4abaca50-6e4d-4947-b3e1-6b627376a788"). InnerVolumeSpecName "kube-api-access-8svll". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.256607 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4abaca50-6e4d-4947-b3e1-6b627376a788" (UID: "4abaca50-6e4d-4947-b3e1-6b627376a788"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.284320 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-config-data" (OuterVolumeSpecName: "config-data") pod "4abaca50-6e4d-4947-b3e1-6b627376a788" (UID: "4abaca50-6e4d-4947-b3e1-6b627376a788"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.314894 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.316804 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8svll\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-kube-api-access-8svll\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.316828 4753 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4abaca50-6e4d-4947-b3e1-6b627376a788-ceph\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.316843 4753 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.316858 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.316870 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4abaca50-6e4d-4947-b3e1-6b627376a788-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.316882 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4abaca50-6e4d-4947-b3e1-6b627376a788-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.755113 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c654466df-ckb5g" event={"ID":"f74ce4f9-2bea-4d8e-a301-b33daef607d6","Type":"ContainerStarted","Data":"d68aa678ce0b74699733e3dd8b9df81af227325c040d4b5e092a20b0fc349fc9"} Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.755297 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c654466df-ckb5g" podUID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerName="horizon-log" containerID="cri-o://984abde2e7b5b64d691afd296a0ed70f89e60c371b19b05f4be7ed13583cc479" gracePeriod=30 Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.755372 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c654466df-ckb5g" podUID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerName="horizon" containerID="cri-o://d68aa678ce0b74699733e3dd8b9df81af227325c040d4b5e092a20b0fc349fc9" gracePeriod=30 Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.758577 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c64b7b9-zzqxw" event={"ID":"f37cc84a-e714-4e00-8885-68ce4992b34e","Type":"ContainerStarted","Data":"98cbaa89515b6b50681f5c5d7f438d34e1ace6e611e1a3cb7e82f420fd5e023a"} Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.767012 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4abaca50-6e4d-4947-b3e1-6b627376a788","Type":"ContainerDied","Data":"87f5ef070045c9fae1265eb3aee9c473b256b36ac042a21a98a75232860db2ca"} Jan 29 15:40:35 crc kubenswrapper[4753]: 
I0129 15:40:35.767070 4753 scope.go:117] "RemoveContainer" containerID="891128b69da3f3bbbd8efd446640966b20c7879b34a670191412181677ba15c5"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.767217 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.774261 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.789628 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd549bc7-5fnnk" event={"ID":"92a1036d-2faa-4780-8ad4-153d6a0ac402","Type":"ContainerStarted","Data":"2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9"}
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.803649 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c654466df-ckb5g" podStartSLOduration=2.457721507 podStartE2EDuration="10.803622723s" podCreationTimestamp="2026-01-29 15:40:25 +0000 UTC" firstStartedPulling="2026-01-29 15:40:25.985329099 +0000 UTC m=+5860.680063481" lastFinishedPulling="2026-01-29 15:40:34.331230315 +0000 UTC m=+5869.025964697" observedRunningTime="2026-01-29 15:40:35.78354677 +0000 UTC m=+5870.478281152" watchObservedRunningTime="2026-01-29 15:40:35.803622723 +0000 UTC m=+5870.498357125"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.818467 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8c64b7b9-zzqxw" podStartSLOduration=3.043521322 podStartE2EDuration="10.818438552s" podCreationTimestamp="2026-01-29 15:40:25 +0000 UTC" firstStartedPulling="2026-01-29 15:40:26.110731473 +0000 UTC m=+5860.805465865" lastFinishedPulling="2026-01-29 15:40:33.885648713 +0000 UTC m=+5868.580383095" observedRunningTime="2026-01-29 15:40:35.807288221 +0000 UTC m=+5870.502022593" watchObservedRunningTime="2026-01-29 15:40:35.818438552 +0000 UTC m=+5870.513172934"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.852128 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-68cd549bc7-5fnnk" podStartSLOduration=3.68904283 podStartE2EDuration="10.85210403s" podCreationTimestamp="2026-01-29 15:40:25 +0000 UTC" firstStartedPulling="2026-01-29 15:40:26.857225635 +0000 UTC m=+5861.551960037" lastFinishedPulling="2026-01-29 15:40:34.020286855 +0000 UTC m=+5868.715021237" observedRunningTime="2026-01-29 15:40:35.847998539 +0000 UTC m=+5870.542732931" watchObservedRunningTime="2026-01-29 15:40:35.85210403 +0000 UTC m=+5870.546838412"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.926713 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.958475 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.958653 4753 scope.go:117] "RemoveContainer" containerID="4505286665ec8df6d6f490fb8862aaf09f3220c1a84751ff2c473a2b109913b7"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.975267 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 15:40:35 crc kubenswrapper[4753]: E0129 15:40:35.975765 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-log"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.975778 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-log"
Jan 29 15:40:35 crc kubenswrapper[4753]: E0129 15:40:35.975792 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-httpd"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.975800 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-httpd"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.975997 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-httpd"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.976034 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" containerName="glance-log"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.978417 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.983714 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 29 15:40:35 crc kubenswrapper[4753]: I0129 15:40:35.996630 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.033303 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.033381 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.033420 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.033573 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-logs\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.033603 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.033624 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.033693 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rlq\" (UniqueName: \"kubernetes.io/projected/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-kube-api-access-s9rlq\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.135737 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-logs\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.135785 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.135804 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.135847 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9rlq\" (UniqueName: \"kubernetes.io/projected/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-kube-api-access-s9rlq\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.135924 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.135951 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.135971 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.136388 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.136596 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-logs\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.147407 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.148457 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.174990 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.175935 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.181304 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abaca50-6e4d-4947-b3e1-6b627376a788" path="/var/lib/kubelet/pods/4abaca50-6e4d-4947-b3e1-6b627376a788/volumes"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.184983 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bc7903-cddb-464d-a5ae-ef660282f4b7" path="/var/lib/kubelet/pods/71bc7903-cddb-464d-a5ae-ef660282f4b7/volumes"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.200826 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9rlq\" (UniqueName: \"kubernetes.io/projected/ea96ed85-0d6b-4874-ae27-4844dc5ac67b-kube-api-access-s9rlq\") pod \"glance-default-internal-api-0\" (UID: \"ea96ed85-0d6b-4874-ae27-4844dc5ac67b\") " pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.307638 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.349444 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-68cd549bc7-5fnnk"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.349559 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68cd549bc7-5fnnk"
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.805188 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"104ef6b1-854f-423e-bfde-2a4d8beedd8f","Type":"ContainerStarted","Data":"b078bdab5ce604666034dfa4edf98d3dccb02719f393635844ed413b5c125237"}
Jan 29 15:40:36 crc kubenswrapper[4753]: I0129 15:40:36.968678 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 29 15:40:37 crc kubenswrapper[4753]: I0129 15:40:37.821019 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"104ef6b1-854f-423e-bfde-2a4d8beedd8f","Type":"ContainerStarted","Data":"0138c39caebc5a2db821cfd8a7782db7d7707ddc710e3ab8a7e4b2f3784fad1f"}
Jan 29 15:40:37 crc kubenswrapper[4753]: I0129 15:40:37.823400 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ea96ed85-0d6b-4874-ae27-4844dc5ac67b","Type":"ContainerStarted","Data":"448755156e823b5b87a90c6c8bb48dcfd869ceb6d713aab227f0e176f59d7e65"}
Jan 29 15:40:38 crc kubenswrapper[4753]: I0129 15:40:38.837336 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ea96ed85-0d6b-4874-ae27-4844dc5ac67b","Type":"ContainerStarted","Data":"4c6f9144aeeacf4a22e1dfda3f2287f6b74dd2b26f6aaeadfeaeb56c2728a9d1"}
Jan 29 15:40:38 crc kubenswrapper[4753]: I0129 15:40:38.842348 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"104ef6b1-854f-423e-bfde-2a4d8beedd8f","Type":"ContainerStarted","Data":"21470d66a5a7a6a4c8b8fa7e23bddc695ef51e1d622c9547855fb7d0d1cbbc48"}
Jan 29 15:40:39 crc kubenswrapper[4753]: I0129 15:40:39.885415 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.885394715 podStartE2EDuration="5.885394715s" podCreationTimestamp="2026-01-29 15:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:40:39.882582279 +0000 UTC m=+5874.577316661" watchObservedRunningTime="2026-01-29 15:40:39.885394715 +0000 UTC m=+5874.580129097"
Jan 29 15:40:40 crc kubenswrapper[4753]: I0129 15:40:40.866181 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ea96ed85-0d6b-4874-ae27-4844dc5ac67b","Type":"ContainerStarted","Data":"9f93d7be9fc903d87dab557c42299d73867cc59c2f74cc15bcd309e639bb500c"}
Jan 29 15:40:40 crc kubenswrapper[4753]: I0129 15:40:40.893071 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.893043563 podStartE2EDuration="5.893043563s" podCreationTimestamp="2026-01-29 15:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:40:40.886703072 +0000 UTC m=+5875.581437464" watchObservedRunningTime="2026-01-29 15:40:40.893043563 +0000 UTC m=+5875.587777945"
Jan 29 15:40:42 crc kubenswrapper[4753]: I0129 15:40:42.150184 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:40:42 crc kubenswrapper[4753]: E0129 15:40:42.150881 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.198946 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.200476 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.237263 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.246967 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.472665 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c654466df-ckb5g"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.582422 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8c64b7b9-zzqxw"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.583184 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8c64b7b9-zzqxw"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.584277 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c64b7b9-zzqxw" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.909509 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 29 15:40:45 crc kubenswrapper[4753]: I0129 15:40:45.909551 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 29 15:40:46 crc kubenswrapper[4753]: I0129 15:40:46.310258 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:46 crc kubenswrapper[4753]: I0129 15:40:46.310469 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:46 crc kubenswrapper[4753]: I0129 15:40:46.351636 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:46 crc kubenswrapper[4753]: I0129 15:40:46.352941 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-68cd549bc7-5fnnk" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.120:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8080: connect: connection refused"
Jan 29 15:40:46 crc kubenswrapper[4753]: I0129 15:40:46.362544 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:46 crc kubenswrapper[4753]: I0129 15:40:46.919335 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:46 crc kubenswrapper[4753]: I0129 15:40:46.919409 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:48 crc kubenswrapper[4753]: I0129 15:40:48.628047 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 29 15:40:48 crc kubenswrapper[4753]: I0129 15:40:48.628698 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 29 15:40:48 crc kubenswrapper[4753]: I0129 15:40:48.641574 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 29 15:40:48 crc kubenswrapper[4753]: I0129 15:40:48.939972 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 29 15:40:48 crc kubenswrapper[4753]: I0129 15:40:48.940004 4753 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 29 15:40:49 crc kubenswrapper[4753]: I0129 15:40:49.289827 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:49 crc kubenswrapper[4753]: I0129 15:40:49.296676 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 29 15:40:54 crc kubenswrapper[4753]: I0129 15:40:54.150006 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:40:54 crc kubenswrapper[4753]: E0129 15:40:54.150907 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:40:55 crc kubenswrapper[4753]: I0129 15:40:55.046977 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-fts8x"]
Jan 29 15:40:55 crc kubenswrapper[4753]: I0129 15:40:55.058160 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-aa39-account-create-update-mnzk6"]
Jan 29 15:40:55 crc kubenswrapper[4753]: I0129 15:40:55.068243 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-fts8x"]
Jan 29 15:40:55 crc kubenswrapper[4753]: I0129 15:40:55.077047 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-aa39-account-create-update-mnzk6"]
Jan 29 15:40:55 crc kubenswrapper[4753]: I0129 15:40:55.582531 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8c64b7b9-zzqxw" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused"
Jan 29 15:40:56 crc kubenswrapper[4753]: I0129 15:40:56.161494 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beaf5340-3a9d-4712-8524-606a091f544e" path="/var/lib/kubelet/pods/beaf5340-3a9d-4712-8524-606a091f544e/volumes"
Jan 29 15:40:56 crc kubenswrapper[4753]: I0129 15:40:56.165320 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c" path="/var/lib/kubelet/pods/c9da281f-b7c0-4cbc-bb66-99f5b92f1d1c/volumes"
Jan 29 15:40:56 crc kubenswrapper[4753]: I0129 15:40:56.350195 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-68cd549bc7-5fnnk" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.120:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8080: connect: connection refused"
Jan 29 15:41:01 crc kubenswrapper[4753]: I0129 15:41:01.037409 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-46k8w"]
Jan 29 15:41:01 crc kubenswrapper[4753]: I0129 15:41:01.047210 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-46k8w"]
Jan 29 15:41:02 crc kubenswrapper[4753]: I0129 15:41:02.163639 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2834447-2ee8-4608-85ff-805e2fcbe7c6" path="/var/lib/kubelet/pods/a2834447-2ee8-4608-85ff-805e2fcbe7c6/volumes"
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.103697 4753 generic.go:334] "Generic (PLEG): container finished" podID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerID="d68aa678ce0b74699733e3dd8b9df81af227325c040d4b5e092a20b0fc349fc9" exitCode=137
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.105607 4753 generic.go:334] "Generic (PLEG): container finished" podID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerID="984abde2e7b5b64d691afd296a0ed70f89e60c371b19b05f4be7ed13583cc479" exitCode=137
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.103789 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c654466df-ckb5g" event={"ID":"f74ce4f9-2bea-4d8e-a301-b33daef607d6","Type":"ContainerDied","Data":"d68aa678ce0b74699733e3dd8b9df81af227325c040d4b5e092a20b0fc349fc9"}
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.105834 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c654466df-ckb5g" event={"ID":"f74ce4f9-2bea-4d8e-a301-b33daef607d6","Type":"ContainerDied","Data":"984abde2e7b5b64d691afd296a0ed70f89e60c371b19b05f4be7ed13583cc479"}
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.712854 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c654466df-ckb5g"
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.801550 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-config-data\") pod \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") "
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.801692 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-scripts\") pod \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") "
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.801892 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd52p\" (UniqueName: \"kubernetes.io/projected/f74ce4f9-2bea-4d8e-a301-b33daef607d6-kube-api-access-kd52p\") pod \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") "
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.801986 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74ce4f9-2bea-4d8e-a301-b33daef607d6-logs\") pod \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") "
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.802063 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f74ce4f9-2bea-4d8e-a301-b33daef607d6-horizon-secret-key\") pod \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\" (UID: \"f74ce4f9-2bea-4d8e-a301-b33daef607d6\") "
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.802384 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74ce4f9-2bea-4d8e-a301-b33daef607d6-logs" (OuterVolumeSpecName: "logs") pod "f74ce4f9-2bea-4d8e-a301-b33daef607d6" (UID: "f74ce4f9-2bea-4d8e-a301-b33daef607d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.807232 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74ce4f9-2bea-4d8e-a301-b33daef607d6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f74ce4f9-2bea-4d8e-a301-b33daef607d6" (UID: "f74ce4f9-2bea-4d8e-a301-b33daef607d6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.807531 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74ce4f9-2bea-4d8e-a301-b33daef607d6-kube-api-access-kd52p" (OuterVolumeSpecName: "kube-api-access-kd52p") pod "f74ce4f9-2bea-4d8e-a301-b33daef607d6" (UID: "f74ce4f9-2bea-4d8e-a301-b33daef607d6"). InnerVolumeSpecName "kube-api-access-kd52p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.837883 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-scripts" (OuterVolumeSpecName: "scripts") pod "f74ce4f9-2bea-4d8e-a301-b33daef607d6" (UID: "f74ce4f9-2bea-4d8e-a301-b33daef607d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.853027 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-config-data" (OuterVolumeSpecName: "config-data") pod "f74ce4f9-2bea-4d8e-a301-b33daef607d6" (UID: "f74ce4f9-2bea-4d8e-a301-b33daef607d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.906514 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd52p\" (UniqueName: \"kubernetes.io/projected/f74ce4f9-2bea-4d8e-a301-b33daef607d6-kube-api-access-kd52p\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.906549 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74ce4f9-2bea-4d8e-a301-b33daef607d6-logs\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.906559 4753 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f74ce4f9-2bea-4d8e-a301-b33daef607d6-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.906570 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:06 crc kubenswrapper[4753]: I0129 15:41:06.906579 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f74ce4f9-2bea-4d8e-a301-b33daef607d6-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:07 crc kubenswrapper[4753]: I0129 15:41:07.117472 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c654466df-ckb5g" event={"ID":"f74ce4f9-2bea-4d8e-a301-b33daef607d6","Type":"ContainerDied","Data":"ed3738305c9e8426b8e12b176e1294e4f01dcd4b5ba171cf5b66f9e91867ff31"}
Jan 29 15:41:07 crc kubenswrapper[4753]: I0129 15:41:07.117522 4753 scope.go:117] "RemoveContainer" containerID="d68aa678ce0b74699733e3dd8b9df81af227325c040d4b5e092a20b0fc349fc9"
Jan 29 15:41:07 crc kubenswrapper[4753]: I0129 15:41:07.117658 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c654466df-ckb5g"
Jan 29 15:41:07 crc kubenswrapper[4753]: I0129 15:41:07.191305 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c654466df-ckb5g"]
Jan 29 15:41:07 crc kubenswrapper[4753]: I0129 15:41:07.202422 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c654466df-ckb5g"]
Jan 29 15:41:07 crc kubenswrapper[4753]: I0129 15:41:07.315924 4753 scope.go:117] "RemoveContainer" containerID="984abde2e7b5b64d691afd296a0ed70f89e60c371b19b05f4be7ed13583cc479"
Jan 29 15:41:07 crc kubenswrapper[4753]: I0129 15:41:07.666857 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8c64b7b9-zzqxw"
Jan 29 15:41:08 crc kubenswrapper[4753]: I0129 15:41:08.160976 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" path="/var/lib/kubelet/pods/f74ce4f9-2bea-4d8e-a301-b33daef607d6/volumes"
Jan 29 15:41:08 crc kubenswrapper[4753]: I0129 15:41:08.450944 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-68cd549bc7-5fnnk"
Jan 29 15:41:09 crc kubenswrapper[4753]: I0129 15:41:09.149309 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:41:09 crc kubenswrapper[4753]: E0129 15:41:09.149700 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:41:09 crc kubenswrapper[4753]: I0129 15:41:09.647009 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8c64b7b9-zzqxw"
Jan 29 15:41:10 crc kubenswrapper[4753]: I0129 15:41:10.199487 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-68cd549bc7-5fnnk"
Jan 29 15:41:10 crc kubenswrapper[4753]: I0129 15:41:10.260791 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c64b7b9-zzqxw"]
Jan 29 15:41:10 crc kubenswrapper[4753]: I0129 15:41:10.261057 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8c64b7b9-zzqxw" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon-log" containerID="cri-o://f3a64cba0e7e4ee1596ff9909ca82a1164baf82a9e161703e98e7aba36caee5b" gracePeriod=30
Jan 29 15:41:10 crc kubenswrapper[4753]: I0129 15:41:10.261231 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8c64b7b9-zzqxw" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon" containerID="cri-o://98cbaa89515b6b50681f5c5d7f438d34e1ace6e611e1a3cb7e82f420fd5e023a" gracePeriod=30
Jan 29 15:41:11 crc kubenswrapper[4753]: I0129 15:41:11.330171 4753 scope.go:117] "RemoveContainer" containerID="0ebb9fc893c0cbdfd16f60f2ea3e4e446d75221e24a32b2cf2df3561df3b9191"
Jan 29 15:41:11 crc kubenswrapper[4753]: I0129 15:41:11.367022 4753 scope.go:117] "RemoveContainer" containerID="950fdfde8da868542ae3476859a5f714f1afd03535c8e5b01a25b64b5f6fba9b"
Jan 29 15:41:11 crc kubenswrapper[4753]: I0129 15:41:11.419211 4753 scope.go:117] "RemoveContainer" containerID="c1072d9eff4fa34a5c6e8776018646f90816190562caca57895700347741c858"
Jan 29 15:41:11 crc kubenswrapper[4753]: I0129 15:41:11.444034 4753 scope.go:117] "RemoveContainer" containerID="b08af0664387ab2af96e764a332519279b48e120cb4c65117e46057e6fdf2f43"
Jan 29 15:41:11 crc kubenswrapper[4753]: I0129 15:41:11.490288 4753 scope.go:117] "RemoveContainer" containerID="89fb396c49784ee588ac62b06ace3b0e44cc17ff37a9366ec923a48df5fcd70f"
Jan 29 15:41:11 crc kubenswrapper[4753]: I0129 15:41:11.528836 4753 scope.go:117] "RemoveContainer" containerID="9a75ac00c7daba2c415d2e3c7b0591c8b551d4903fd8edd980d0e552aba51abf"
Jan 29 15:41:14 crc kubenswrapper[4753]: I0129 15:41:14.201901 4753 generic.go:334] "Generic (PLEG): container finished" podID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerID="98cbaa89515b6b50681f5c5d7f438d34e1ace6e611e1a3cb7e82f420fd5e023a" exitCode=0
Jan 29 15:41:14 crc kubenswrapper[4753]: I0129 15:41:14.201994 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c64b7b9-zzqxw" event={"ID":"f37cc84a-e714-4e00-8885-68ce4992b34e","Type":"ContainerDied","Data":"98cbaa89515b6b50681f5c5d7f438d34e1ace6e611e1a3cb7e82f420fd5e023a"}
Jan 29 15:41:15 crc kubenswrapper[4753]: I0129 15:41:15.582842 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8c64b7b9-zzqxw" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused"
Jan 29 15:41:22 crc kubenswrapper[4753]: I0129 15:41:22.149083 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:41:22 crc kubenswrapper[4753]: E0129 15:41:22.149859 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:41:25 crc kubenswrapper[4753]: I0129 15:41:25.586400 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8c64b7b9-zzqxw" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused"
Jan 29 15:41:26 crc kubenswrapper[4753]: I0129 15:41:26.047134 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qslf4"]
Jan 29 15:41:26 crc kubenswrapper[4753]: I0129 15:41:26.056764 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9cac-account-create-update-fwgr5"]
Jan 29 15:41:26 crc kubenswrapper[4753]: I0129 15:41:26.065953 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qslf4"]
Jan 29 15:41:26 crc kubenswrapper[4753]: I0129 15:41:26.075657 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9cac-account-create-update-fwgr5"]
Jan 29 15:41:26 crc kubenswrapper[4753]: I0129 15:41:26.160115 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b717bb47-ad09-46c8-8f5c-6d760dcdfc9b" path="/var/lib/kubelet/pods/b717bb47-ad09-46c8-8f5c-6d760dcdfc9b/volumes"
Jan 29 15:41:26 crc kubenswrapper[4753]: I0129 15:41:26.161888 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b3b498-38f4-46cd-b5a8-7fb0512ffb95" path="/var/lib/kubelet/pods/e8b3b498-38f4-46cd-b5a8-7fb0512ffb95/volumes"
Jan 29 15:41:35 crc kubenswrapper[4753]: I0129 15:41:35.042110 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8j5qx"]
Jan 29 15:41:35 crc kubenswrapper[4753]: I0129 15:41:35.054036 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8j5qx"]
Jan 29 15:41:35 crc kubenswrapper[4753]: I0129 15:41:35.150414 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:41:35 crc kubenswrapper[4753]: E0129 15:41:35.150722 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:41:35 crc kubenswrapper[4753]: I0129 15:41:35.583227 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8c64b7b9-zzqxw" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused"
Jan 29 15:41:35 crc kubenswrapper[4753]: I0129 15:41:35.583391 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8c64b7b9-zzqxw"
Jan 29 15:41:36 crc kubenswrapper[4753]: I0129 15:41:36.172567 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df72e2f5-4140-4320-9057-573e0d202332" path="/var/lib/kubelet/pods/df72e2f5-4140-4320-9057-573e0d202332/volumes"
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.450505 4753 generic.go:334] "Generic (PLEG): container finished" podID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerID="f3a64cba0e7e4ee1596ff9909ca82a1164baf82a9e161703e98e7aba36caee5b" exitCode=137
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.450611 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c64b7b9-zzqxw" event={"ID":"f37cc84a-e714-4e00-8885-68ce4992b34e","Type":"ContainerDied","Data":"f3a64cba0e7e4ee1596ff9909ca82a1164baf82a9e161703e98e7aba36caee5b"}
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.697599 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c64b7b9-zzqxw"
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.826904 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f37cc84a-e714-4e00-8885-68ce4992b34e-logs\") pod \"f37cc84a-e714-4e00-8885-68ce4992b34e\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") "
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.827066 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-scripts\") pod \"f37cc84a-e714-4e00-8885-68ce4992b34e\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") "
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.827140 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f37cc84a-e714-4e00-8885-68ce4992b34e-horizon-secret-key\") pod \"f37cc84a-e714-4e00-8885-68ce4992b34e\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") "
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.827246 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-config-data\") pod \"f37cc84a-e714-4e00-8885-68ce4992b34e\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") "
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.827354 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4vvf\" (UniqueName: \"kubernetes.io/projected/f37cc84a-e714-4e00-8885-68ce4992b34e-kube-api-access-w4vvf\") pod \"f37cc84a-e714-4e00-8885-68ce4992b34e\" (UID: \"f37cc84a-e714-4e00-8885-68ce4992b34e\") "
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.828481 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37cc84a-e714-4e00-8885-68ce4992b34e-logs" (OuterVolumeSpecName: "logs") pod "f37cc84a-e714-4e00-8885-68ce4992b34e" (UID: "f37cc84a-e714-4e00-8885-68ce4992b34e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.833295 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37cc84a-e714-4e00-8885-68ce4992b34e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f37cc84a-e714-4e00-8885-68ce4992b34e" (UID: "f37cc84a-e714-4e00-8885-68ce4992b34e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.834483 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37cc84a-e714-4e00-8885-68ce4992b34e-kube-api-access-w4vvf" (OuterVolumeSpecName: "kube-api-access-w4vvf") pod "f37cc84a-e714-4e00-8885-68ce4992b34e" (UID: "f37cc84a-e714-4e00-8885-68ce4992b34e"). InnerVolumeSpecName "kube-api-access-w4vvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.859700 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-config-data" (OuterVolumeSpecName: "config-data") pod "f37cc84a-e714-4e00-8885-68ce4992b34e" (UID: "f37cc84a-e714-4e00-8885-68ce4992b34e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.861608 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-scripts" (OuterVolumeSpecName: "scripts") pod "f37cc84a-e714-4e00-8885-68ce4992b34e" (UID: "f37cc84a-e714-4e00-8885-68ce4992b34e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.931533 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f37cc84a-e714-4e00-8885-68ce4992b34e-logs\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.931613 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-scripts\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.931631 4753 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f37cc84a-e714-4e00-8885-68ce4992b34e-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.931646 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f37cc84a-e714-4e00-8885-68ce4992b34e-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:40 crc kubenswrapper[4753]: I0129 15:41:40.931663 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4vvf\" (UniqueName: \"kubernetes.io/projected/f37cc84a-e714-4e00-8885-68ce4992b34e-kube-api-access-w4vvf\") on node \"crc\" DevicePath \"\""
Jan 29 15:41:41 crc kubenswrapper[4753]: I0129 15:41:41.463173 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c64b7b9-zzqxw" event={"ID":"f37cc84a-e714-4e00-8885-68ce4992b34e","Type":"ContainerDied","Data":"d4a8f2e9dc266917f59ab65b350f5d4a103d456dea9846aa83edc56aea46d704"}
Jan 29 15:41:41 crc kubenswrapper[4753]: I0129 15:41:41.463228 4753 scope.go:117] "RemoveContainer" containerID="98cbaa89515b6b50681f5c5d7f438d34e1ace6e611e1a3cb7e82f420fd5e023a"
Jan 29 15:41:41 crc kubenswrapper[4753]: I0129 15:41:41.464169 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c64b7b9-zzqxw"
Jan 29 15:41:41 crc kubenswrapper[4753]: I0129 15:41:41.512562 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c64b7b9-zzqxw"]
Jan 29 15:41:41 crc kubenswrapper[4753]: I0129 15:41:41.533832 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8c64b7b9-zzqxw"]
Jan 29 15:41:41 crc kubenswrapper[4753]: I0129 15:41:41.627277 4753 scope.go:117] "RemoveContainer" containerID="f3a64cba0e7e4ee1596ff9909ca82a1164baf82a9e161703e98e7aba36caee5b"
Jan 29 15:41:42 crc kubenswrapper[4753]: I0129 15:41:42.161574 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" path="/var/lib/kubelet/pods/f37cc84a-e714-4e00-8885-68ce4992b34e/volumes"
Jan 29 15:41:47 crc kubenswrapper[4753]: I0129 15:41:47.149562 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:41:47 crc kubenswrapper[4753]: E0129 15:41:47.150195 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:41:58 crc kubenswrapper[4753]: I0129 15:41:58.150370 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:41:58 crc kubenswrapper[4753]: E0129 15:41:58.151773 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:42:10 crc kubenswrapper[4753]: I0129 15:42:10.151678 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:42:10 crc kubenswrapper[4753]: E0129 15:42:10.152825 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:42:11 crc kubenswrapper[4753]: I0129 15:42:11.722759 4753 scope.go:117] "RemoveContainer" containerID="592d94274dcce8c03f05a8a7563552395e5dc3d1e4c1531a077c14acb7ab27cc"
Jan 29 15:42:11 crc kubenswrapper[4753]: I0129 15:42:11.769253 4753 scope.go:117] "RemoveContainer" containerID="55719bfd1035cecf78229922aae1343249d3fcc17d4b4201732714e31a0ae252"
Jan 29 15:42:11 crc kubenswrapper[4753]: I0129 15:42:11.804211 4753 scope.go:117] "RemoveContainer" containerID="3c45b582985d1e02bd7d28b086406d7fd8c82f48df6231daa33bf78cb067a865"
Jan 29 15:42:16 crc kubenswrapper[4753]: I0129 15:42:16.047307 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lgmwt"]
Jan 29 15:42:16 crc kubenswrapper[4753]: I0129 15:42:16.058565 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lgmwt"]
Jan 29 15:42:16 crc kubenswrapper[4753]: I0129 15:42:16.179865 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f16af97b-955b-4903-b680-f7250f57874f" path="/var/lib/kubelet/pods/f16af97b-955b-4903-b680-f7250f57874f/volumes"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.037666 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bc3f-account-create-update-94pk6"]
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.049983 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bc3f-account-create-update-94pk6"]
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.742115 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6797b66f4f-wv5m7"]
Jan 29 15:42:17 crc kubenswrapper[4753]: E0129 15:42:17.742653 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerName="horizon-log"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.742675 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerName="horizon-log"
Jan 29 15:42:17 crc kubenswrapper[4753]: E0129 15:42:17.742709 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon-log"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.742717 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon-log"
Jan 29 15:42:17 crc kubenswrapper[4753]: E0129 15:42:17.742736 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerName="horizon"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.742742 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerName="horizon"
Jan 29 15:42:17 crc kubenswrapper[4753]: E0129 15:42:17.742753 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.742759 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.742979 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerName="horizon-log"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.742998 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74ce4f9-2bea-4d8e-a301-b33daef607d6" containerName="horizon"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.743012 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.743023 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37cc84a-e714-4e00-8885-68ce4992b34e" containerName="horizon-log"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.748585 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.763353 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6797b66f4f-wv5m7"]
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.935318 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fced629-f257-4241-9f17-7856b0472fb9-logs\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.935396 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb672\" (UniqueName: \"kubernetes.io/projected/5fced629-f257-4241-9f17-7856b0472fb9-kube-api-access-tb672\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.935426 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5fced629-f257-4241-9f17-7856b0472fb9-config-data\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.935607 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fced629-f257-4241-9f17-7856b0472fb9-scripts\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:17 crc kubenswrapper[4753]: I0129 15:42:17.935670 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5fced629-f257-4241-9f17-7856b0472fb9-horizon-secret-key\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.036542 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fced629-f257-4241-9f17-7856b0472fb9-logs\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.036625 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb672\" (UniqueName: \"kubernetes.io/projected/5fced629-f257-4241-9f17-7856b0472fb9-kube-api-access-tb672\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.036662 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5fced629-f257-4241-9f17-7856b0472fb9-config-data\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.036728 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fced629-f257-4241-9f17-7856b0472fb9-scripts\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.036757 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5fced629-f257-4241-9f17-7856b0472fb9-horizon-secret-key\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.036979 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fced629-f257-4241-9f17-7856b0472fb9-logs\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.038225 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5fced629-f257-4241-9f17-7856b0472fb9-config-data\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.038645 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fced629-f257-4241-9f17-7856b0472fb9-scripts\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.043255 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5fced629-f257-4241-9f17-7856b0472fb9-horizon-secret-key\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.058381 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb672\" (UniqueName: \"kubernetes.io/projected/5fced629-f257-4241-9f17-7856b0472fb9-kube-api-access-tb672\") pod \"horizon-6797b66f4f-wv5m7\" (UID: \"5fced629-f257-4241-9f17-7856b0472fb9\") " pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.074018 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.209447 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515e5952-854e-4cbe-9d9b-e6b27d558e69" path="/var/lib/kubelet/pods/515e5952-854e-4cbe-9d9b-e6b27d558e69/volumes"
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.776109 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6797b66f4f-wv5m7"]
Jan 29 15:42:18 crc kubenswrapper[4753]: I0129 15:42:18.835316 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6797b66f4f-wv5m7" event={"ID":"5fced629-f257-4241-9f17-7856b0472fb9","Type":"ContainerStarted","Data":"132d6a265fdd77a81a1d1f75c72506f6e7e3969eb8c0508b2b05143aed215f35"}
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.058792 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-2tsdk"]
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.066746 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-2tsdk"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.085317 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-2tsdk"]
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.159027 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5980b21e-1b3d-4944-b456-b7b1047ab256-operator-scripts\") pod \"heat-db-create-2tsdk\" (UID: \"5980b21e-1b3d-4944-b456-b7b1047ab256\") " pod="openstack/heat-db-create-2tsdk"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.159428 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm7qj\" (UniqueName: \"kubernetes.io/projected/5980b21e-1b3d-4944-b456-b7b1047ab256-kube-api-access-rm7qj\") pod \"heat-db-create-2tsdk\" (UID: \"5980b21e-1b3d-4944-b456-b7b1047ab256\") " pod="openstack/heat-db-create-2tsdk"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.177305 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cb3d-account-create-update-zsflm"]
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.179219 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cb3d-account-create-update-zsflm"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.181024 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.195343 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cb3d-account-create-update-zsflm"]
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.261192 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2d9450-6c48-4896-b5f6-cafb5803c488-operator-scripts\") pod \"heat-cb3d-account-create-update-zsflm\" (UID: \"8a2d9450-6c48-4896-b5f6-cafb5803c488\") " pod="openstack/heat-cb3d-account-create-update-zsflm"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.262143 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5980b21e-1b3d-4944-b456-b7b1047ab256-operator-scripts\") pod \"heat-db-create-2tsdk\" (UID: \"5980b21e-1b3d-4944-b456-b7b1047ab256\") " pod="openstack/heat-db-create-2tsdk"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.262694 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ngr6\" (UniqueName: \"kubernetes.io/projected/8a2d9450-6c48-4896-b5f6-cafb5803c488-kube-api-access-5ngr6\") pod \"heat-cb3d-account-create-update-zsflm\" (UID: \"8a2d9450-6c48-4896-b5f6-cafb5803c488\") " pod="openstack/heat-cb3d-account-create-update-zsflm"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.263415 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm7qj\" (UniqueName: \"kubernetes.io/projected/5980b21e-1b3d-4944-b456-b7b1047ab256-kube-api-access-rm7qj\") pod \"heat-db-create-2tsdk\" (UID: \"5980b21e-1b3d-4944-b456-b7b1047ab256\") " pod="openstack/heat-db-create-2tsdk"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.263449 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5980b21e-1b3d-4944-b456-b7b1047ab256-operator-scripts\") pod \"heat-db-create-2tsdk\" (UID: \"5980b21e-1b3d-4944-b456-b7b1047ab256\") " pod="openstack/heat-db-create-2tsdk"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.292813 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm7qj\" (UniqueName: \"kubernetes.io/projected/5980b21e-1b3d-4944-b456-b7b1047ab256-kube-api-access-rm7qj\") pod \"heat-db-create-2tsdk\" (UID: \"5980b21e-1b3d-4944-b456-b7b1047ab256\") " pod="openstack/heat-db-create-2tsdk"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.365277 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ngr6\" (UniqueName: \"kubernetes.io/projected/8a2d9450-6c48-4896-b5f6-cafb5803c488-kube-api-access-5ngr6\") pod \"heat-cb3d-account-create-update-zsflm\" (UID: \"8a2d9450-6c48-4896-b5f6-cafb5803c488\") " pod="openstack/heat-cb3d-account-create-update-zsflm"
Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.365432 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2d9450-6c48-4896-b5f6-cafb5803c488-operator-scripts\") pod \"heat-cb3d-account-create-update-zsflm\" (UID: 
\"8a2d9450-6c48-4896-b5f6-cafb5803c488\") " pod="openstack/heat-cb3d-account-create-update-zsflm" Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.366411 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2d9450-6c48-4896-b5f6-cafb5803c488-operator-scripts\") pod \"heat-cb3d-account-create-update-zsflm\" (UID: \"8a2d9450-6c48-4896-b5f6-cafb5803c488\") " pod="openstack/heat-cb3d-account-create-update-zsflm" Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.383104 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ngr6\" (UniqueName: \"kubernetes.io/projected/8a2d9450-6c48-4896-b5f6-cafb5803c488-kube-api-access-5ngr6\") pod \"heat-cb3d-account-create-update-zsflm\" (UID: \"8a2d9450-6c48-4896-b5f6-cafb5803c488\") " pod="openstack/heat-cb3d-account-create-update-zsflm" Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.470522 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-2tsdk" Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.511633 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cb3d-account-create-update-zsflm" Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.847515 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6797b66f4f-wv5m7" event={"ID":"5fced629-f257-4241-9f17-7856b0472fb9","Type":"ContainerStarted","Data":"9b7d90c666ac818a2313b8497b45d4eeccfbccc45164a6160d3cf6305e50aafa"} Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.847838 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6797b66f4f-wv5m7" event={"ID":"5fced629-f257-4241-9f17-7856b0472fb9","Type":"ContainerStarted","Data":"bd6e4d44617d93c7f6370cdf9231f9799c243662c5e312518a7d37907e28f460"} Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.879227 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6797b66f4f-wv5m7" podStartSLOduration=2.8791989730000003 podStartE2EDuration="2.879198973s" podCreationTimestamp="2026-01-29 15:42:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:42:19.874540647 +0000 UTC m=+5974.569275039" watchObservedRunningTime="2026-01-29 15:42:19.879198973 +0000 UTC m=+5974.573933365" Jan 29 15:42:19 crc kubenswrapper[4753]: I0129 15:42:19.942038 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-2tsdk"] Jan 29 15:42:20 crc kubenswrapper[4753]: I0129 15:42:20.034627 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cb3d-account-create-update-zsflm"] Jan 29 15:42:20 crc kubenswrapper[4753]: I0129 15:42:20.857319 4753 generic.go:334] "Generic (PLEG): container finished" podID="5980b21e-1b3d-4944-b456-b7b1047ab256" containerID="5daee1292b901aa70277adf7b115daf106676adf5a935efd229bf172bd4b5ae9" exitCode=0 Jan 29 15:42:20 crc kubenswrapper[4753]: I0129 15:42:20.857426 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-2tsdk" event={"ID":"5980b21e-1b3d-4944-b456-b7b1047ab256","Type":"ContainerDied","Data":"5daee1292b901aa70277adf7b115daf106676adf5a935efd229bf172bd4b5ae9"} Jan 29 15:42:20 crc kubenswrapper[4753]: I0129 15:42:20.857720 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-2tsdk" 
event={"ID":"5980b21e-1b3d-4944-b456-b7b1047ab256","Type":"ContainerStarted","Data":"39a6f924776ae75848fa27c28550f30a3e5298e3db5070bef8e29a60e735fa99"} Jan 29 15:42:20 crc kubenswrapper[4753]: I0129 15:42:20.859252 4753 generic.go:334] "Generic (PLEG): container finished" podID="8a2d9450-6c48-4896-b5f6-cafb5803c488" containerID="8e5eb4735ff280a07a116e04bd45b5783dafc0d21b8867bbc82e0e110805be2e" exitCode=0 Jan 29 15:42:20 crc kubenswrapper[4753]: I0129 15:42:20.860562 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cb3d-account-create-update-zsflm" event={"ID":"8a2d9450-6c48-4896-b5f6-cafb5803c488","Type":"ContainerDied","Data":"8e5eb4735ff280a07a116e04bd45b5783dafc0d21b8867bbc82e0e110805be2e"} Jan 29 15:42:20 crc kubenswrapper[4753]: I0129 15:42:20.860588 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cb3d-account-create-update-zsflm" event={"ID":"8a2d9450-6c48-4896-b5f6-cafb5803c488","Type":"ContainerStarted","Data":"9223ca984fa996f04fde2d9764a7b9514db2128c37f2da0be4ad439ef750a98d"} Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.402214 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cb3d-account-create-update-zsflm" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.410893 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-2tsdk" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.439811 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5980b21e-1b3d-4944-b456-b7b1047ab256-operator-scripts\") pod \"5980b21e-1b3d-4944-b456-b7b1047ab256\" (UID: \"5980b21e-1b3d-4944-b456-b7b1047ab256\") " Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.439849 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2d9450-6c48-4896-b5f6-cafb5803c488-operator-scripts\") pod \"8a2d9450-6c48-4896-b5f6-cafb5803c488\" (UID: \"8a2d9450-6c48-4896-b5f6-cafb5803c488\") " Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.439877 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ngr6\" (UniqueName: \"kubernetes.io/projected/8a2d9450-6c48-4896-b5f6-cafb5803c488-kube-api-access-5ngr6\") pod \"8a2d9450-6c48-4896-b5f6-cafb5803c488\" (UID: \"8a2d9450-6c48-4896-b5f6-cafb5803c488\") " Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.439905 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm7qj\" (UniqueName: \"kubernetes.io/projected/5980b21e-1b3d-4944-b456-b7b1047ab256-kube-api-access-rm7qj\") pod \"5980b21e-1b3d-4944-b456-b7b1047ab256\" (UID: \"5980b21e-1b3d-4944-b456-b7b1047ab256\") " Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.440378 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5980b21e-1b3d-4944-b456-b7b1047ab256-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5980b21e-1b3d-4944-b456-b7b1047ab256" (UID: "5980b21e-1b3d-4944-b456-b7b1047ab256"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.440554 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5980b21e-1b3d-4944-b456-b7b1047ab256-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.440835 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a2d9450-6c48-4896-b5f6-cafb5803c488-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a2d9450-6c48-4896-b5f6-cafb5803c488" (UID: "8a2d9450-6c48-4896-b5f6-cafb5803c488"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.449454 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5980b21e-1b3d-4944-b456-b7b1047ab256-kube-api-access-rm7qj" (OuterVolumeSpecName: "kube-api-access-rm7qj") pod "5980b21e-1b3d-4944-b456-b7b1047ab256" (UID: "5980b21e-1b3d-4944-b456-b7b1047ab256"). InnerVolumeSpecName "kube-api-access-rm7qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.449530 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2d9450-6c48-4896-b5f6-cafb5803c488-kube-api-access-5ngr6" (OuterVolumeSpecName: "kube-api-access-5ngr6") pod "8a2d9450-6c48-4896-b5f6-cafb5803c488" (UID: "8a2d9450-6c48-4896-b5f6-cafb5803c488"). InnerVolumeSpecName "kube-api-access-5ngr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.541777 4753 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a2d9450-6c48-4896-b5f6-cafb5803c488-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.541814 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ngr6\" (UniqueName: \"kubernetes.io/projected/8a2d9450-6c48-4896-b5f6-cafb5803c488-kube-api-access-5ngr6\") on node \"crc\" DevicePath \"\"" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.541825 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm7qj\" (UniqueName: \"kubernetes.io/projected/5980b21e-1b3d-4944-b456-b7b1047ab256-kube-api-access-rm7qj\") on node \"crc\" DevicePath \"\"" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.881048 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cb3d-account-create-update-zsflm" event={"ID":"8a2d9450-6c48-4896-b5f6-cafb5803c488","Type":"ContainerDied","Data":"9223ca984fa996f04fde2d9764a7b9514db2128c37f2da0be4ad439ef750a98d"} Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.881520 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9223ca984fa996f04fde2d9764a7b9514db2128c37f2da0be4ad439ef750a98d" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.881109 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cb3d-account-create-update-zsflm" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.882782 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-2tsdk" event={"ID":"5980b21e-1b3d-4944-b456-b7b1047ab256","Type":"ContainerDied","Data":"39a6f924776ae75848fa27c28550f30a3e5298e3db5070bef8e29a60e735fa99"} Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.882837 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a6f924776ae75848fa27c28550f30a3e5298e3db5070bef8e29a60e735fa99" Jan 29 15:42:22 crc kubenswrapper[4753]: I0129 15:42:22.882808 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-2tsdk" Jan 29 15:42:23 crc kubenswrapper[4753]: I0129 15:42:23.149843 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:42:23 crc kubenswrapper[4753]: E0129 15:42:23.150125 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.391930 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-w4q8l"] Jan 29 15:42:24 crc kubenswrapper[4753]: E0129 15:42:24.392881 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5980b21e-1b3d-4944-b456-b7b1047ab256" containerName="mariadb-database-create" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.392907 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5980b21e-1b3d-4944-b456-b7b1047ab256" containerName="mariadb-database-create" Jan 29 15:42:24 crc kubenswrapper[4753]: E0129 15:42:24.392976 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2d9450-6c48-4896-b5f6-cafb5803c488" containerName="mariadb-account-create-update" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.392985 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2d9450-6c48-4896-b5f6-cafb5803c488" containerName="mariadb-account-create-update" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.393252 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2d9450-6c48-4896-b5f6-cafb5803c488" containerName="mariadb-account-create-update" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.393274 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5980b21e-1b3d-4944-b456-b7b1047ab256" containerName="mariadb-database-create" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.394128 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-w4q8l" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.396029 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.397429 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-qn2p5" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.402333 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-w4q8l"] Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.584662 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-combined-ca-bundle\") pod \"heat-db-sync-w4q8l\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") " pod="openstack/heat-db-sync-w4q8l" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.584762 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-config-data\") pod \"heat-db-sync-w4q8l\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") " pod="openstack/heat-db-sync-w4q8l" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.585002 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhcsc\" (UniqueName: \"kubernetes.io/projected/17a71ced-859b-411a-8502-2bbeeae1bf5e-kube-api-access-dhcsc\") pod \"heat-db-sync-w4q8l\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") " pod="openstack/heat-db-sync-w4q8l" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.687842 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhcsc\" (UniqueName: \"kubernetes.io/projected/17a71ced-859b-411a-8502-2bbeeae1bf5e-kube-api-access-dhcsc\") pod \"heat-db-sync-w4q8l\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") " pod="openstack/heat-db-sync-w4q8l" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.688262 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-combined-ca-bundle\") pod \"heat-db-sync-w4q8l\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") " pod="openstack/heat-db-sync-w4q8l" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.688355 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-config-data\") pod \"heat-db-sync-w4q8l\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") " pod="openstack/heat-db-sync-w4q8l" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.693979 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-combined-ca-bundle\") pod \"heat-db-sync-w4q8l\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") " pod="openstack/heat-db-sync-w4q8l" Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.695963 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-config-data\") pod \"heat-db-sync-w4q8l\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") " pod="openstack/heat-db-sync-w4q8l" 
Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.705579 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhcsc\" (UniqueName: \"kubernetes.io/projected/17a71ced-859b-411a-8502-2bbeeae1bf5e-kube-api-access-dhcsc\") pod \"heat-db-sync-w4q8l\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") " pod="openstack/heat-db-sync-w4q8l"
Jan 29 15:42:24 crc kubenswrapper[4753]: I0129 15:42:24.720743 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w4q8l"
Jan 29 15:42:25 crc kubenswrapper[4753]: I0129 15:42:25.039250 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hw54d"]
Jan 29 15:42:25 crc kubenswrapper[4753]: I0129 15:42:25.048952 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hw54d"]
Jan 29 15:42:25 crc kubenswrapper[4753]: I0129 15:42:25.203902 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-w4q8l"]
Jan 29 15:42:25 crc kubenswrapper[4753]: W0129 15:42:25.204012 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17a71ced_859b_411a_8502_2bbeeae1bf5e.slice/crio-271a45156e474846837a8202daa67ea59078f824567a180ef201bc25b609d690 WatchSource:0}: Error finding container 271a45156e474846837a8202daa67ea59078f824567a180ef201bc25b609d690: Status 404 returned error can't find the container with id 271a45156e474846837a8202daa67ea59078f824567a180ef201bc25b609d690
Jan 29 15:42:25 crc kubenswrapper[4753]: I0129 15:42:25.916191 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w4q8l" event={"ID":"17a71ced-859b-411a-8502-2bbeeae1bf5e","Type":"ContainerStarted","Data":"271a45156e474846837a8202daa67ea59078f824567a180ef201bc25b609d690"}
Jan 29 15:42:26 crc kubenswrapper[4753]: I0129 15:42:26.164347 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2871ffb-46c8-4307-90d7-6fc9402cb8cc" path="/var/lib/kubelet/pods/a2871ffb-46c8-4307-90d7-6fc9402cb8cc/volumes"
Jan 29 15:42:28 crc kubenswrapper[4753]: I0129 15:42:28.075207 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:28 crc kubenswrapper[4753]: I0129 15:42:28.075803 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:35 crc kubenswrapper[4753]: I0129 15:42:35.150524 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:42:35 crc kubenswrapper[4753]: E0129 15:42:35.151483 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:42:37 crc kubenswrapper[4753]: I0129 15:42:37.077443 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w4q8l" event={"ID":"17a71ced-859b-411a-8502-2bbeeae1bf5e","Type":"ContainerStarted","Data":"85c759c0c34e73ab48cbd19681d76303b45380c75a0b0c24f7a82ba282c246c7"}
Jan 29 15:42:37 crc kubenswrapper[4753]: I0129 15:42:37.096452 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-w4q8l" podStartSLOduration=2.040191425 podStartE2EDuration="13.09643083s" podCreationTimestamp="2026-01-29 15:42:24 +0000 UTC" firstStartedPulling="2026-01-29 15:42:25.206388578 +0000 UTC m=+5979.901122960" lastFinishedPulling="2026-01-29 15:42:36.262627983 +0000 UTC m=+5990.957362365" observedRunningTime="2026-01-29 15:42:37.094843127 +0000 UTC m=+5991.789577509" watchObservedRunningTime="2026-01-29 15:42:37.09643083 +0000 UTC m=+5991.791165212"
Jan 29 15:42:38 crc kubenswrapper[4753]: I0129 15:42:38.076955 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6797b66f4f-wv5m7" podUID="5fced629-f257-4241-9f17-7856b0472fb9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.123:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.123:8080: connect: connection refused"
Jan 29 15:42:40 crc kubenswrapper[4753]: I0129 15:42:40.110886 4753 generic.go:334] "Generic (PLEG): container finished" podID="17a71ced-859b-411a-8502-2bbeeae1bf5e" containerID="85c759c0c34e73ab48cbd19681d76303b45380c75a0b0c24f7a82ba282c246c7" exitCode=0
Jan 29 15:42:40 crc kubenswrapper[4753]: I0129 15:42:40.111231 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w4q8l" event={"ID":"17a71ced-859b-411a-8502-2bbeeae1bf5e","Type":"ContainerDied","Data":"85c759c0c34e73ab48cbd19681d76303b45380c75a0b0c24f7a82ba282c246c7"}
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.485377 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w4q8l"
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.655831 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhcsc\" (UniqueName: \"kubernetes.io/projected/17a71ced-859b-411a-8502-2bbeeae1bf5e-kube-api-access-dhcsc\") pod \"17a71ced-859b-411a-8502-2bbeeae1bf5e\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") "
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.655977 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-config-data\") pod \"17a71ced-859b-411a-8502-2bbeeae1bf5e\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") "
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.656107 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-combined-ca-bundle\") pod \"17a71ced-859b-411a-8502-2bbeeae1bf5e\" (UID: \"17a71ced-859b-411a-8502-2bbeeae1bf5e\") "
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.672874 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a71ced-859b-411a-8502-2bbeeae1bf5e-kube-api-access-dhcsc" (OuterVolumeSpecName: "kube-api-access-dhcsc") pod "17a71ced-859b-411a-8502-2bbeeae1bf5e" (UID: "17a71ced-859b-411a-8502-2bbeeae1bf5e"). InnerVolumeSpecName "kube-api-access-dhcsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.688425 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17a71ced-859b-411a-8502-2bbeeae1bf5e" (UID: "17a71ced-859b-411a-8502-2bbeeae1bf5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.734339 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-config-data" (OuterVolumeSpecName: "config-data") pod "17a71ced-859b-411a-8502-2bbeeae1bf5e" (UID: "17a71ced-859b-411a-8502-2bbeeae1bf5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.759342 4753 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.759404 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhcsc\" (UniqueName: \"kubernetes.io/projected/17a71ced-859b-411a-8502-2bbeeae1bf5e-kube-api-access-dhcsc\") on node \"crc\" DevicePath \"\""
Jan 29 15:42:41 crc kubenswrapper[4753]: I0129 15:42:41.759423 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a71ced-859b-411a-8502-2bbeeae1bf5e-config-data\") on node \"crc\" DevicePath \"\""
Jan 29 15:42:42 crc kubenswrapper[4753]: I0129 15:42:42.140635 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-w4q8l"
Jan 29 15:42:42 crc kubenswrapper[4753]: I0129 15:42:42.140634 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-w4q8l" event={"ID":"17a71ced-859b-411a-8502-2bbeeae1bf5e","Type":"ContainerDied","Data":"271a45156e474846837a8202daa67ea59078f824567a180ef201bc25b609d690"}
Jan 29 15:42:42 crc kubenswrapper[4753]: I0129 15:42:42.140798 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="271a45156e474846837a8202daa67ea59078f824567a180ef201bc25b609d690"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.408893 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6b948c8fcf-v2dml"]
Jan 29 15:42:43 crc kubenswrapper[4753]: E0129 15:42:43.417687 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a71ced-859b-411a-8502-2bbeeae1bf5e" containerName="heat-db-sync"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.417714 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a71ced-859b-411a-8502-2bbeeae1bf5e" containerName="heat-db-sync"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.417983 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a71ced-859b-411a-8502-2bbeeae1bf5e" containerName="heat-db-sync"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.418929 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.443639 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.444435 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.444589 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-qn2p5"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.483960 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b948c8fcf-v2dml"]
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.585466 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-db45c8f86-r6kzd"]
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.587670 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.594296 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.603607 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c80c58-8fa7-4f69-8909-2134cb48d953-config-data\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.603921 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c80c58-8fa7-4f69-8909-2134cb48d953-combined-ca-bundle\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.604051 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7c80c58-8fa7-4f69-8909-2134cb48d953-config-data-custom\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.604335 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsd6c\" (UniqueName: \"kubernetes.io/projected/b7c80c58-8fa7-4f69-8909-2134cb48d953-kube-api-access-rsd6c\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.613143 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-db45c8f86-r6kzd"]
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.706274 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38e0b4cf-43ec-4245-9b77-5198f3846248-config-data-custom\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.706392 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqfvn\" (UniqueName: \"kubernetes.io/projected/38e0b4cf-43ec-4245-9b77-5198f3846248-kube-api-access-rqfvn\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.706425 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e0b4cf-43ec-4245-9b77-5198f3846248-combined-ca-bundle\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.706456 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsd6c\" (UniqueName: \"kubernetes.io/projected/b7c80c58-8fa7-4f69-8909-2134cb48d953-kube-api-access-rsd6c\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.706522 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c80c58-8fa7-4f69-8909-2134cb48d953-config-data\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.706546 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e0b4cf-43ec-4245-9b77-5198f3846248-config-data\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.706582 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c80c58-8fa7-4f69-8909-2134cb48d953-combined-ca-bundle\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.706612 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7c80c58-8fa7-4f69-8909-2134cb48d953-config-data-custom\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.717643 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c80c58-8fa7-4f69-8909-2134cb48d953-config-data\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.719914 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7c80c58-8fa7-4f69-8909-2134cb48d953-config-data-custom\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.729196 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c80c58-8fa7-4f69-8909-2134cb48d953-combined-ca-bundle\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.769274 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsd6c\" (UniqueName: \"kubernetes.io/projected/b7c80c58-8fa7-4f69-8909-2134cb48d953-kube-api-access-rsd6c\") pod \"heat-engine-6b948c8fcf-v2dml\" (UID: \"b7c80c58-8fa7-4f69-8909-2134cb48d953\") " pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.811809 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e0b4cf-43ec-4245-9b77-5198f3846248-config-data\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.812159 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38e0b4cf-43ec-4245-9b77-5198f3846248-config-data-custom\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.812323 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqfvn\" (UniqueName: \"kubernetes.io/projected/38e0b4cf-43ec-4245-9b77-5198f3846248-kube-api-access-rqfvn\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.812449 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e0b4cf-43ec-4245-9b77-5198f3846248-combined-ca-bundle\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.820977 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38e0b4cf-43ec-4245-9b77-5198f3846248-config-data-custom\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.828262 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e0b4cf-43ec-4245-9b77-5198f3846248-config-data\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.828972 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e0b4cf-43ec-4245-9b77-5198f3846248-combined-ca-bundle\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.872212 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5dc4fc86d8-4q5ch"]
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.873565 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.880928 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.887236 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqfvn\" (UniqueName: \"kubernetes.io/projected/38e0b4cf-43ec-4245-9b77-5198f3846248-kube-api-access-rqfvn\") pod \"heat-cfnapi-db45c8f86-r6kzd\" (UID: \"38e0b4cf-43ec-4245-9b77-5198f3846248\") " pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.899558 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5dc4fc86d8-4q5ch"]
Jan 29 15:42:43 crc kubenswrapper[4753]: I0129 15:42:43.977669 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.016753 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009bf51b-359a-4af2-9834-ef20a1268c6d-config-data\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.017269 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjj5r\" (UniqueName: \"kubernetes.io/projected/009bf51b-359a-4af2-9834-ef20a1268c6d-kube-api-access-bjj5r\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.017985 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009bf51b-359a-4af2-9834-ef20a1268c6d-combined-ca-bundle\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.018130 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/009bf51b-359a-4af2-9834-ef20a1268c6d-config-data-custom\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.051438 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.120783 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009bf51b-359a-4af2-9834-ef20a1268c6d-config-data\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.120883 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjj5r\" (UniqueName: \"kubernetes.io/projected/009bf51b-359a-4af2-9834-ef20a1268c6d-kube-api-access-bjj5r\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.121051 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009bf51b-359a-4af2-9834-ef20a1268c6d-combined-ca-bundle\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.121094 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/009bf51b-359a-4af2-9834-ef20a1268c6d-config-data-custom\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.132618 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/009bf51b-359a-4af2-9834-ef20a1268c6d-config-data\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.133177 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/009bf51b-359a-4af2-9834-ef20a1268c6d-config-data-custom\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.142612 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009bf51b-359a-4af2-9834-ef20a1268c6d-combined-ca-bundle\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.170638 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjj5r\" (UniqueName: \"kubernetes.io/projected/009bf51b-359a-4af2-9834-ef20a1268c6d-kube-api-access-bjj5r\") pod \"heat-api-5dc4fc86d8-4q5ch\" (UID: \"009bf51b-359a-4af2-9834-ef20a1268c6d\") " pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.257693 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.665645 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-db45c8f86-r6kzd"]
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.830823 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b948c8fcf-v2dml"]
Jan 29 15:42:44 crc kubenswrapper[4753]: W0129 15:42:44.835316 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7c80c58_8fa7_4f69_8909_2134cb48d953.slice/crio-4c565b48a2bf2b60c18a666a05e5a9d08d3e464ec958e4a2577d15ec06dbb513 WatchSource:0}: Error finding container 4c565b48a2bf2b60c18a666a05e5a9d08d3e464ec958e4a2577d15ec06dbb513: Status 404 returned error can't find the container with id 4c565b48a2bf2b60c18a666a05e5a9d08d3e464ec958e4a2577d15ec06dbb513
Jan 29 15:42:44 crc kubenswrapper[4753]: I0129 15:42:44.959139 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5dc4fc86d8-4q5ch"]
Jan 29 15:42:44 crc kubenswrapper[4753]: W0129 15:42:44.959392 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod009bf51b_359a_4af2_9834_ef20a1268c6d.slice/crio-cca2bc4039cbaa4256254e9192a37109f3e595a1ef9e7e7ceade0298bbe87522 WatchSource:0}: Error finding container cca2bc4039cbaa4256254e9192a37109f3e595a1ef9e7e7ceade0298bbe87522: Status 404 returned error can't find the container with id cca2bc4039cbaa4256254e9192a37109f3e595a1ef9e7e7ceade0298bbe87522
Jan 29 15:42:45 crc kubenswrapper[4753]: I0129 15:42:45.215704 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5dc4fc86d8-4q5ch" event={"ID":"009bf51b-359a-4af2-9834-ef20a1268c6d","Type":"ContainerStarted","Data":"cca2bc4039cbaa4256254e9192a37109f3e595a1ef9e7e7ceade0298bbe87522"}
Jan 29 15:42:45 crc kubenswrapper[4753]: I0129 15:42:45.218614 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-db45c8f86-r6kzd" event={"ID":"38e0b4cf-43ec-4245-9b77-5198f3846248","Type":"ContainerStarted","Data":"099b8029877143a84e04b2e2d7fe9a532b134446e7ff64c4470c934672600769"}
Jan 29 15:42:45 crc kubenswrapper[4753]: I0129 15:42:45.220811 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b948c8fcf-v2dml" event={"ID":"b7c80c58-8fa7-4f69-8909-2134cb48d953","Type":"ContainerStarted","Data":"4c565b48a2bf2b60c18a666a05e5a9d08d3e464ec958e4a2577d15ec06dbb513"}
Jan 29 15:42:46 crc kubenswrapper[4753]: I0129 15:42:46.248532 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b948c8fcf-v2dml" event={"ID":"b7c80c58-8fa7-4f69-8909-2134cb48d953","Type":"ContainerStarted","Data":"011c9b24885a79334480119869c2295c1a8a3e80d54f0b996096429c411cdefe"}
Jan 29 15:42:46 crc kubenswrapper[4753]: I0129 15:42:46.248982 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:42:46 crc kubenswrapper[4753]: I0129 15:42:46.282213 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6b948c8fcf-v2dml" podStartSLOduration=3.282186516 podStartE2EDuration="3.282186516s" podCreationTimestamp="2026-01-29 15:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:42:46.266004479 +0000 UTC m=+6000.960738901" watchObservedRunningTime="2026-01-29 15:42:46.282186516 +0000 UTC m=+6000.976920898"
Jan 29 15:42:49 crc kubenswrapper[4753]: I0129 15:42:49.291316 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5dc4fc86d8-4q5ch" event={"ID":"009bf51b-359a-4af2-9834-ef20a1268c6d","Type":"ContainerStarted","Data":"d899f6f7d62114cd1a444a1828b5bbb8d1178f32c0e7885397ada1875f8b3cc1"}
Jan 29 15:42:49 crc kubenswrapper[4753]: I0129 15:42:49.292785 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:49 crc kubenswrapper[4753]: I0129 15:42:49.294051 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-db45c8f86-r6kzd" event={"ID":"38e0b4cf-43ec-4245-9b77-5198f3846248","Type":"ContainerStarted","Data":"11059c9ade16ef6c99351c4a8721cb184a18590e1347cca7a86c4594b547751d"}
Jan 29 15:42:49 crc kubenswrapper[4753]: I0129 15:42:49.294194 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:49 crc kubenswrapper[4753]: I0129 15:42:49.323210 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5dc4fc86d8-4q5ch" podStartSLOduration=3.411245888 podStartE2EDuration="6.323186037s" podCreationTimestamp="2026-01-29 15:42:43 +0000 UTC" firstStartedPulling="2026-01-29 15:42:44.962461638 +0000 UTC m=+5999.657196020" lastFinishedPulling="2026-01-29 15:42:47.874401787 +0000 UTC m=+6002.569136169" observedRunningTime="2026-01-29 15:42:49.309106568 +0000 UTC m=+6004.003840960" watchObservedRunningTime="2026-01-29 15:42:49.323186037 +0000 UTC m=+6004.017920429"
Jan 29 15:42:49 crc kubenswrapper[4753]: I0129 15:42:49.341587 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-db45c8f86-r6kzd" podStartSLOduration=3.147195304 podStartE2EDuration="6.341555723s" podCreationTimestamp="2026-01-29 15:42:43 +0000 UTC" firstStartedPulling="2026-01-29 15:42:44.676417 +0000 UTC m=+5999.371151382" lastFinishedPulling="2026-01-29 15:42:47.870777419 +0000 UTC m=+6002.565511801" observedRunningTime="2026-01-29 15:42:49.329043145 +0000 UTC m=+6004.023777527" watchObservedRunningTime="2026-01-29 15:42:49.341555723 +0000 UTC m=+6004.036290105"
Jan 29 15:42:50 crc kubenswrapper[4753]: I0129 15:42:50.149690 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:42:50 crc kubenswrapper[4753]: E0129 15:42:50.150337 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f"
Jan 29 15:42:50 crc kubenswrapper[4753]: I0129 15:42:50.346025 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:52 crc kubenswrapper[4753]: I0129 15:42:52.767037 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6797b66f4f-wv5m7"
Jan 29 15:42:52 crc kubenswrapper[4753]: I0129 15:42:52.840738 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68cd549bc7-5fnnk"]
Jan 29 15:42:52 crc kubenswrapper[4753]: I0129 15:42:52.840978 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68cd549bc7-5fnnk" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon-log" containerID="cri-o://6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b" gracePeriod=30
Jan 29 15:42:52 crc kubenswrapper[4753]: I0129 15:42:52.841129 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68cd549bc7-5fnnk" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon" containerID="cri-o://2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9" gracePeriod=30
Jan 29 15:42:55 crc kubenswrapper[4753]: I0129 15:42:55.054853 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8d6c-account-create-update-nnjmp"]
Jan 29 15:42:55 crc kubenswrapper[4753]: I0129 15:42:55.064704 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-bqwgr"]
Jan 29 15:42:55 crc kubenswrapper[4753]: I0129 15:42:55.081276 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8d6c-account-create-update-nnjmp"]
Jan 29 15:42:55 crc kubenswrapper[4753]: I0129 15:42:55.094012 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-bqwgr"]
Jan 29 15:42:55 crc kubenswrapper[4753]: I0129 15:42:55.816682 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-db45c8f86-r6kzd"
Jan 29 15:42:56 crc kubenswrapper[4753]: I0129 15:42:56.161413 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba286775-81ab-4630-bf6e-0824c2a89a6b" path="/var/lib/kubelet/pods/ba286775-81ab-4630-bf6e-0824c2a89a6b/volumes"
Jan 29 15:42:56 crc kubenswrapper[4753]: I0129 15:42:56.162061 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b0f669-00e6-4f72-bc26-91272c722e84" path="/var/lib/kubelet/pods/f1b0f669-00e6-4f72-bc26-91272c722e84/volumes"
Jan 29 15:42:56 crc kubenswrapper[4753]: I0129 15:42:56.238639 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5dc4fc86d8-4q5ch"
Jan 29 15:42:56 crc kubenswrapper[4753]: I0129 15:42:56.350079 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68cd549bc7-5fnnk" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.120:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8080: connect: connection refused"
Jan 29 15:42:56 crc kubenswrapper[4753]: I0129 15:42:56.377356 4753 generic.go:334] "Generic (PLEG): container finished" podID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerID="2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9" exitCode=0
Jan 29 15:42:56 crc kubenswrapper[4753]: I0129 15:42:56.377413 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd549bc7-5fnnk" event={"ID":"92a1036d-2faa-4780-8ad4-153d6a0ac402","Type":"ContainerDied","Data":"2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9"}
Jan 29 15:43:00 crc kubenswrapper[4753]: I0129 15:43:00.040462 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-cpnns"]
Jan 29 15:43:00 crc kubenswrapper[4753]: I0129 15:43:00.051407 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-cpnns"]
Jan 29 15:43:00 crc kubenswrapper[4753]: I0129 15:43:00.160844 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4bb3975-ee1c-4d33-b619-d0f898300c93" path="/var/lib/kubelet/pods/b4bb3975-ee1c-4d33-b619-d0f898300c93/volumes"
Jan 29 15:43:04 crc kubenswrapper[4753]: I0129 15:43:04.087748 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6b948c8fcf-v2dml"
Jan 29 15:43:04 crc kubenswrapper[4753]: I0129 15:43:04.149736 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e"
Jan 29 15:43:05 crc kubenswrapper[4753]: I0129 15:43:05.474790 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"36ed70dab0ac14e2bbb32c7b086f3b2d4999c7c2d1318f4807ee75319489be51"}
Jan 29 15:43:06 crc kubenswrapper[4753]: I0129 15:43:06.350626 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68cd549bc7-5fnnk" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.120:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8080: connect: connection refused"
Jan 29 15:43:11 crc kubenswrapper[4753]: I0129 15:43:11.917536 4753 scope.go:117] "RemoveContainer" containerID="966fe858ffe3e9ff480a826a2682f30e7d74bfd056261085cf0293ab64c2dd7c"
Jan 29 15:43:11 crc kubenswrapper[4753]: I0129 15:43:11.967248 4753 scope.go:117] "RemoveContainer" containerID="c12fc654db8e9dad7f73f6cafd5bb44659472fea9254158b325fb2406d1ae05d"
Jan 29 15:43:12 crc kubenswrapper[4753]: I0129 15:43:12.016033 4753 scope.go:117] "RemoveContainer" containerID="8c5b4650ab8532e0dce229abbef4cae22e01c1012f7c1891031c24f505c83b31"
Jan 29 15:43:12 crc kubenswrapper[4753]: I0129 15:43:12.052107 4753 scope.go:117] "RemoveContainer" containerID="78e54756122595ece8fadea5a2a1241966847d69027f124e6fcdde3ed271d95c"
Jan 29 15:43:12 crc kubenswrapper[4753]: I0129 15:43:12.110346 4753 scope.go:117] "RemoveContainer" containerID="d829463f61eaf3c4db63ee69993e3899aad905441208fee8b6d0864b72523a57"
Jan 29 15:43:12 crc kubenswrapper[4753]: I0129 15:43:12.150471 4753 scope.go:117] "RemoveContainer" containerID="f596a6ac800a4d5f38e9ac5b72f7c6590e212839d6b81773272a7724d90c3c65"
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.070487 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"]
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.073408 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.083719 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"]
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.092452 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.148096 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.148188 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgx5m\" (UniqueName: \"kubernetes.io/projected/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-kube-api-access-rgx5m\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.148217 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.250384 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.250742 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgx5m\" (UniqueName: \"kubernetes.io/projected/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-kube-api-access-rgx5m\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.250858 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"
Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.250980 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName:
\"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.251362 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.275084 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgx5m\" (UniqueName: \"kubernetes.io/projected/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-kube-api-access-rgx5m\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.402952 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" Jan 29 15:43:14 crc kubenswrapper[4753]: I0129 15:43:14.836438 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"] Jan 29 15:43:15 crc kubenswrapper[4753]: I0129 15:43:15.581348 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" event={"ID":"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b","Type":"ContainerStarted","Data":"f641cb5a49396340de614e23f2312a8d64551ad175d406b30ab06e60b9d9bc08"} Jan 29 15:43:15 crc kubenswrapper[4753]: I0129 15:43:15.582024 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" event={"ID":"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b","Type":"ContainerStarted","Data":"5e07e0608f52efebd3b8861e1317996f00d6679f71a69620ad0afc13f2c4ee2b"} Jan 29 15:43:16 crc kubenswrapper[4753]: I0129 15:43:16.349594 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68cd549bc7-5fnnk" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.120:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8080: connect: connection refused" Jan 29 15:43:16 crc kubenswrapper[4753]: I0129 15:43:16.350020 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:43:16 crc kubenswrapper[4753]: I0129 15:43:16.592376 4753 generic.go:334] "Generic (PLEG): container finished" podID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerID="f641cb5a49396340de614e23f2312a8d64551ad175d406b30ab06e60b9d9bc08" exitCode=0 Jan 29 15:43:16 crc kubenswrapper[4753]: I0129 15:43:16.592425 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" event={"ID":"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b","Type":"ContainerDied","Data":"f641cb5a49396340de614e23f2312a8d64551ad175d406b30ab06e60b9d9bc08"} 
Jan 29 15:43:16 crc kubenswrapper[4753]: E0129 15:43:16.717397 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6" Jan 29 15:43:16 crc kubenswrapper[4753]: E0129 15:43:16.717569 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgx5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_openshift-marketplace(ea6b2cb6-43c4-4b8d-ac86-c5522959a43b): ErrImagePull: initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:43:16 crc kubenswrapper[4753]: E0129 15:43:16.718774 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:43:17 crc kubenswrapper[4753]: E0129 15:43:17.605578 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.313197 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.487591 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb9gx\" (UniqueName: \"kubernetes.io/projected/92a1036d-2faa-4780-8ad4-153d6a0ac402-kube-api-access-rb9gx\") pod \"92a1036d-2faa-4780-8ad4-153d6a0ac402\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.487734 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-scripts\") pod \"92a1036d-2faa-4780-8ad4-153d6a0ac402\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.487795 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-config-data\") pod \"92a1036d-2faa-4780-8ad4-153d6a0ac402\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.487902 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92a1036d-2faa-4780-8ad4-153d6a0ac402-horizon-secret-key\") pod \"92a1036d-2faa-4780-8ad4-153d6a0ac402\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.487985 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a1036d-2faa-4780-8ad4-153d6a0ac402-logs\") pod \"92a1036d-2faa-4780-8ad4-153d6a0ac402\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.488397 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a1036d-2faa-4780-8ad4-153d6a0ac402-logs" (OuterVolumeSpecName: "logs") pod "92a1036d-2faa-4780-8ad4-153d6a0ac402" (UID: "92a1036d-2faa-4780-8ad4-153d6a0ac402"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.488518 4753 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92a1036d-2faa-4780-8ad4-153d6a0ac402-logs\") on node \"crc\" DevicePath \"\"" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.493574 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a1036d-2faa-4780-8ad4-153d6a0ac402-kube-api-access-rb9gx" (OuterVolumeSpecName: "kube-api-access-rb9gx") pod "92a1036d-2faa-4780-8ad4-153d6a0ac402" (UID: "92a1036d-2faa-4780-8ad4-153d6a0ac402"). InnerVolumeSpecName "kube-api-access-rb9gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.493817 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a1036d-2faa-4780-8ad4-153d6a0ac402-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "92a1036d-2faa-4780-8ad4-153d6a0ac402" (UID: "92a1036d-2faa-4780-8ad4-153d6a0ac402"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:43:23 crc kubenswrapper[4753]: E0129 15:43:23.514569 4753 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-scripts podName:92a1036d-2faa-4780-8ad4-153d6a0ac402 nodeName:}" failed. No retries permitted until 2026-01-29 15:43:24.014539865 +0000 UTC m=+6038.709274257 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "scripts" (UniqueName: "kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-scripts") pod "92a1036d-2faa-4780-8ad4-153d6a0ac402" (UID: "92a1036d-2faa-4780-8ad4-153d6a0ac402") : error deleting /var/lib/kubelet/pods/92a1036d-2faa-4780-8ad4-153d6a0ac402/volume-subpaths: remove /var/lib/kubelet/pods/92a1036d-2faa-4780-8ad4-153d6a0ac402/volume-subpaths: no such file or directory Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.515261 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-config-data" (OuterVolumeSpecName: "config-data") pod "92a1036d-2faa-4780-8ad4-153d6a0ac402" (UID: "92a1036d-2faa-4780-8ad4-153d6a0ac402"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.590825 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb9gx\" (UniqueName: \"kubernetes.io/projected/92a1036d-2faa-4780-8ad4-153d6a0ac402-kube-api-access-rb9gx\") on node \"crc\" DevicePath \"\"" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.591178 4753 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-config-data\") on node \"crc\" DevicePath \"\"" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.591187 4753 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/92a1036d-2faa-4780-8ad4-153d6a0ac402-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.664787 4753 generic.go:334] "Generic (PLEG): container finished" podID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerID="6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b" exitCode=137 Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.664832 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd549bc7-5fnnk" event={"ID":"92a1036d-2faa-4780-8ad4-153d6a0ac402","Type":"ContainerDied","Data":"6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b"} Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.664861 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68cd549bc7-5fnnk" event={"ID":"92a1036d-2faa-4780-8ad4-153d6a0ac402","Type":"ContainerDied","Data":"b7ebddbe9b65b74ebda3fb617d2df2aeda90a78fb4cb31ff00bdebc314174d65"} Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.664877 4753 scope.go:117] "RemoveContainer" 
containerID="2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.664876 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68cd549bc7-5fnnk" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.843102 4753 scope.go:117] "RemoveContainer" containerID="6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.862001 4753 scope.go:117] "RemoveContainer" containerID="2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9" Jan 29 15:43:23 crc kubenswrapper[4753]: E0129 15:43:23.862577 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9\": container with ID starting with 2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9 not found: ID does not exist" containerID="2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.862621 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9"} err="failed to get container status \"2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9\": rpc error: code = NotFound desc = could not find container \"2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9\": container with ID starting with 2ca3c6c16e1a5949e68a6f40c6fead0b6a86bfd2855c8d596861a666ff21c4f9 not found: ID does not exist" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.862644 4753 scope.go:117] "RemoveContainer" containerID="6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b" Jan 29 15:43:23 crc kubenswrapper[4753]: E0129 15:43:23.863094 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b\": container with ID starting with 6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b not found: ID does not exist" containerID="6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b" Jan 29 15:43:23 crc kubenswrapper[4753]: I0129 15:43:23.863131 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b"} err="failed to get container status \"6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b\": rpc error: code = NotFound desc = could not find container \"6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b\": container with ID starting with 6307fdf1e2d000b6c1adb652c10c8286a9aaaa7ae58a1047fdb762c20e5c695b not found: ID does not exist" Jan 29 15:43:24 crc kubenswrapper[4753]: I0129 15:43:24.101736 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-scripts\") pod \"92a1036d-2faa-4780-8ad4-153d6a0ac402\" (UID: \"92a1036d-2faa-4780-8ad4-153d6a0ac402\") " Jan 29 15:43:24 crc kubenswrapper[4753]: I0129 15:43:24.102322 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-scripts" (OuterVolumeSpecName: "scripts") pod "92a1036d-2faa-4780-8ad4-153d6a0ac402" (UID: 
"92a1036d-2faa-4780-8ad4-153d6a0ac402"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:43:24 crc kubenswrapper[4753]: I0129 15:43:24.102469 4753 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/92a1036d-2faa-4780-8ad4-153d6a0ac402-scripts\") on node \"crc\" DevicePath \"\"" Jan 29 15:43:24 crc kubenswrapper[4753]: I0129 15:43:24.299102 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68cd549bc7-5fnnk"] Jan 29 15:43:24 crc kubenswrapper[4753]: I0129 15:43:24.310518 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68cd549bc7-5fnnk"] Jan 29 15:43:26 crc kubenswrapper[4753]: I0129 15:43:26.177264 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" path="/var/lib/kubelet/pods/92a1036d-2faa-4780-8ad4-153d6a0ac402/volumes" Jan 29 15:43:31 crc kubenswrapper[4753]: E0129 15:43:31.283317 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6" Jan 29 15:43:31 crc kubenswrapper[4753]: E0129 15:43:31.285561 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgx5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_openshift-marketplace(ea6b2cb6-43c4-4b8d-ac86-c5522959a43b): ErrImagePull: initializing source 
docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:43:31 crc kubenswrapper[4753]: E0129 15:43:31.286887 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:43:45 crc kubenswrapper[4753]: E0129 15:43:45.152245 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.057509 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tmg7x"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.069080 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-bsz4g"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.085176 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-57e6-account-create-update-zm6k8"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.098731 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-93e2-account-create-update-7dxp5"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.124965 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4efd-account-create-update-9rt7v"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.137282 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5kggc"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.153021 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-bsz4g"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.167634 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tmg7x"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.179382 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-93e2-account-create-update-7dxp5"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.187726 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5kggc"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.197077 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-57e6-account-create-update-zm6k8"] Jan 29 15:43:59 crc kubenswrapper[4753]: I0129 15:43:59.206726 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4efd-account-create-update-9rt7v"] Jan 29 15:43:59 crc kubenswrapper[4753]: E0129 15:43:59.283805 4753 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Unknown desc = initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6" Jan 29 15:43:59 crc kubenswrapper[4753]: E0129 15:43:59.283986 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgx5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_openshift-marketplace(ea6b2cb6-43c4-4b8d-ac86-c5522959a43b): ErrImagePull: initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:43:59 crc kubenswrapper[4753]: E0129 15:43:59.285189 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:44:00 crc kubenswrapper[4753]: I0129 15:44:00.172524 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d53a245-058d-428a-9de6-4d65eff12330" path="/var/lib/kubelet/pods/1d53a245-058d-428a-9de6-4d65eff12330/volumes" Jan 29 15:44:00 crc kubenswrapper[4753]: I0129 15:44:00.178756 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="2df4bfd0-cd78-4225-9eac-0903d4df186d" path="/var/lib/kubelet/pods/2df4bfd0-cd78-4225-9eac-0903d4df186d/volumes" Jan 29 15:44:00 crc kubenswrapper[4753]: I0129 15:44:00.179369 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33591b63-9910-4e2b-8188-c8598c1b510f" path="/var/lib/kubelet/pods/33591b63-9910-4e2b-8188-c8598c1b510f/volumes" Jan 29 15:44:00 crc kubenswrapper[4753]: I0129 15:44:00.179912 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3f5b7b-6054-4179-aefb-8ac06bf44628" path="/var/lib/kubelet/pods/6f3f5b7b-6054-4179-aefb-8ac06bf44628/volumes" Jan 29 15:44:00 crc kubenswrapper[4753]: I0129 15:44:00.181537 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7" path="/var/lib/kubelet/pods/9f4ce3cc-9eb9-4a02-80d7-b3c0361613c7/volumes" Jan 29 15:44:00 crc kubenswrapper[4753]: I0129 15:44:00.182639 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26381a9-132d-429c-bbaf-396b609c273c" path="/var/lib/kubelet/pods/b26381a9-132d-429c-bbaf-396b609c273c/volumes" Jan 29 15:44:12 crc kubenswrapper[4753]: I0129 15:44:12.364194 4753 scope.go:117] "RemoveContainer" containerID="13e68e8a4ccd774ac3de8880e8336b0a1d0e28bb31d378e3ad175d659df35e8e" Jan 29 15:44:12 crc kubenswrapper[4753]: I0129 15:44:12.399566 4753 scope.go:117] "RemoveContainer" containerID="12194361fbbdf0e221be0d1d2fa832b7a7be746789bcf75f3cd26d41d57dfa13" Jan 29 15:44:12 crc kubenswrapper[4753]: I0129 15:44:12.437255 4753 scope.go:117] "RemoveContainer" containerID="0a6be775f2cd5e5d07c78d6c98ef48e7324779a0346174264aa991bdcbeccd33" Jan 29 15:44:12 crc kubenswrapper[4753]: I0129 15:44:12.487016 4753 scope.go:117] "RemoveContainer" containerID="f9851b510b51f6cedeedb2272ea38c449045648ef008760f4bd1c7523f76faff" Jan 29 15:44:12 crc kubenswrapper[4753]: I0129 15:44:12.532306 4753 scope.go:117] "RemoveContainer" containerID="db01f621e3933d8854d001cfb96afddc78b096bf6c50af9bf725b27ada74f3bc" Jan 29 15:44:12 crc kubenswrapper[4753]: I0129 15:44:12.578083 4753 scope.go:117] "RemoveContainer" containerID="22faef2d8e3c78953fdd5fe9ae0808301b52e36fbb26df710e1c6fa0216fd2c2" Jan 29 15:44:13 crc kubenswrapper[4753]: E0129 15:44:13.159656 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:44:15 crc kubenswrapper[4753]: I0129 15:44:15.044606 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jljmv"] Jan 29 15:44:15 crc kubenswrapper[4753]: I0129 15:44:15.053860 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jljmv"] Jan 29 15:44:16 crc kubenswrapper[4753]: I0129 15:44:16.158940 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f925a54-0169-4eea-b309-8f7d168419a0" path="/var/lib/kubelet/pods/0f925a54-0169-4eea-b309-8f7d168419a0/volumes" Jan 29 15:44:18 crc kubenswrapper[4753]: I0129 15:44:18.924084 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xgt6l"] Jan 29 15:44:18 crc kubenswrapper[4753]: E0129 15:44:18.924985 
4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon-log" Jan 29 15:44:18 crc kubenswrapper[4753]: I0129 15:44:18.925001 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon-log" Jan 29 15:44:18 crc kubenswrapper[4753]: E0129 15:44:18.925026 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon" Jan 29 15:44:18 crc kubenswrapper[4753]: I0129 15:44:18.925034 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon" Jan 29 15:44:18 crc kubenswrapper[4753]: I0129 15:44:18.925322 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon" Jan 29 15:44:18 crc kubenswrapper[4753]: I0129 15:44:18.925339 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a1036d-2faa-4780-8ad4-153d6a0ac402" containerName="horizon-log" Jan 29 15:44:18 crc kubenswrapper[4753]: I0129 15:44:18.927075 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:18 crc kubenswrapper[4753]: I0129 15:44:18.950844 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgt6l"] Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.009910 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26zht\" (UniqueName: \"kubernetes.io/projected/61359d71-6838-4f79-8df5-3bf8b620f0a6-kube-api-access-26zht\") pod \"community-operators-xgt6l\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.010336 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-catalog-content\") pod \"community-operators-xgt6l\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.010633 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-utilities\") pod \"community-operators-xgt6l\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.114570 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26zht\" (UniqueName: \"kubernetes.io/projected/61359d71-6838-4f79-8df5-3bf8b620f0a6-kube-api-access-26zht\") pod \"community-operators-xgt6l\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.114996 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-catalog-content\") pod \"community-operators-xgt6l\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc 
kubenswrapper[4753]: I0129 15:44:19.115381 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-utilities\") pod \"community-operators-xgt6l\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.115984 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-catalog-content\") pod \"community-operators-xgt6l\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.116111 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-utilities\") pod \"community-operators-xgt6l\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.141584 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26zht\" (UniqueName: \"kubernetes.io/projected/61359d71-6838-4f79-8df5-3bf8b620f0a6-kube-api-access-26zht\") pod \"community-operators-xgt6l\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.250725 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:44:19 crc kubenswrapper[4753]: W0129 15:44:19.939961 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61359d71_6838_4f79_8df5_3bf8b620f0a6.slice/crio-e65ba39e0606b6ce3e0c4f4e6113b81b537882ab8e6e9789d726efa1c7118b77 WatchSource:0}: Error finding container e65ba39e0606b6ce3e0c4f4e6113b81b537882ab8e6e9789d726efa1c7118b77: Status 404 returned error can't find the container with id e65ba39e0606b6ce3e0c4f4e6113b81b537882ab8e6e9789d726efa1c7118b77 Jan 29 15:44:19 crc kubenswrapper[4753]: I0129 15:44:19.942307 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgt6l"] Jan 29 15:44:20 crc kubenswrapper[4753]: I0129 15:44:20.244920 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt6l" event={"ID":"61359d71-6838-4f79-8df5-3bf8b620f0a6","Type":"ContainerStarted","Data":"aaef84bcdd34f623978253790ddac6bc188597ebb67728278b58266f759a5653"} Jan 29 15:44:20 crc kubenswrapper[4753]: I0129 15:44:20.245299 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt6l" event={"ID":"61359d71-6838-4f79-8df5-3bf8b620f0a6","Type":"ContainerStarted","Data":"e65ba39e0606b6ce3e0c4f4e6113b81b537882ab8e6e9789d726efa1c7118b77"} Jan 29 15:44:21 crc kubenswrapper[4753]: I0129 15:44:21.260551 4753 generic.go:334] "Generic (PLEG): container finished" podID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerID="aaef84bcdd34f623978253790ddac6bc188597ebb67728278b58266f759a5653" exitCode=0 Jan 29 15:44:21 crc kubenswrapper[4753]: I0129 15:44:21.260609 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt6l" 
event={"ID":"61359d71-6838-4f79-8df5-3bf8b620f0a6","Type":"ContainerDied","Data":"aaef84bcdd34f623978253790ddac6bc188597ebb67728278b58266f759a5653"} Jan 29 15:44:21 crc kubenswrapper[4753]: I0129 15:44:21.263263 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 15:44:21 crc kubenswrapper[4753]: E0129 15:44:21.390466 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 15:44:21 crc kubenswrapper[4753]: E0129 15:44:21.391492 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26zht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xgt6l_openshift-marketplace(61359d71-6838-4f79-8df5-3bf8b620f0a6): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:44:21 crc kubenswrapper[4753]: E0129 15:44:21.392851 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:44:22 crc kubenswrapper[4753]: E0129 15:44:22.273339 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:44:28 crc 
kubenswrapper[4753]: E0129 15:44:28.152024 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:44:30 crc kubenswrapper[4753]: I0129 15:44:30.030049 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4zbrh"] Jan 29 15:44:30 crc kubenswrapper[4753]: I0129 15:44:30.038801 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4zbrh"] Jan 29 15:44:30 crc kubenswrapper[4753]: I0129 15:44:30.161127 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b" path="/var/lib/kubelet/pods/0d7cd059-b4b2-4516-b9d4-2e2e13b7b38b/volumes" Jan 29 15:44:31 crc kubenswrapper[4753]: I0129 15:44:31.033955 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-n4smg"] Jan 29 15:44:31 crc kubenswrapper[4753]: I0129 15:44:31.042729 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-n4smg"] Jan 29 15:44:32 crc kubenswrapper[4753]: I0129 15:44:32.171088 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de6fe34-0d8e-494e-8cea-b91c8e437b88" path="/var/lib/kubelet/pods/8de6fe34-0d8e-494e-8cea-b91c8e437b88/volumes" Jan 29 15:44:36 crc kubenswrapper[4753]: E0129 15:44:36.279695 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 15:44:36 crc kubenswrapper[4753]: E0129 15:44:36.280455 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26zht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xgt6l_openshift-marketplace(61359d71-6838-4f79-8df5-3bf8b620f0a6): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:44:36 crc kubenswrapper[4753]: E0129 15:44:36.281655 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:44:41 crc kubenswrapper[4753]: E0129 15:44:41.308119 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6" Jan 29 15:44:41 crc kubenswrapper[4753]: E0129 15:44:41.308827 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgx5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_openshift-marketplace(ea6b2cb6-43c4-4b8d-ac86-c5522959a43b): ErrImagePull: initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:44:41 crc kubenswrapper[4753]: E0129 15:44:41.310068 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:44:49 crc kubenswrapper[4753]: I0129 15:44:49.053241 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-d5h4f"] Jan 29 15:44:49 crc kubenswrapper[4753]: I0129 15:44:49.063456 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-d5h4f"] Jan 29 15:44:50 crc kubenswrapper[4753]: I0129 15:44:50.162914 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e23b99b1-289f-4f3b-8de7-7567e21674a4" path="/var/lib/kubelet/pods/e23b99b1-289f-4f3b-8de7-7567e21674a4/volumes" Jan 29 15:44:52 crc kubenswrapper[4753]: E0129 15:44:52.151936 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:44:54 crc kubenswrapper[4753]: E0129 15:44:54.150865 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.174922 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5"] Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.177539 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.180822 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.181083 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.183680 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5"] Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.313080 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97b9b728-5c14-478a-a902-7b01d6ca918f-secret-volume\") pod \"collect-profiles-29495025-7dzd5\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.313474 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97b9b728-5c14-478a-a902-7b01d6ca918f-config-volume\") pod \"collect-profiles-29495025-7dzd5\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.313571 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwm7z\" (UniqueName: \"kubernetes.io/projected/97b9b728-5c14-478a-a902-7b01d6ca918f-kube-api-access-bwm7z\") pod \"collect-profiles-29495025-7dzd5\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.415238 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97b9b728-5c14-478a-a902-7b01d6ca918f-secret-volume\") pod \"collect-profiles-29495025-7dzd5\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.415296 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97b9b728-5c14-478a-a902-7b01d6ca918f-config-volume\") pod \"collect-profiles-29495025-7dzd5\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.415402 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwm7z\" (UniqueName: 
\"kubernetes.io/projected/97b9b728-5c14-478a-a902-7b01d6ca918f-kube-api-access-bwm7z\") pod \"collect-profiles-29495025-7dzd5\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.416555 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97b9b728-5c14-478a-a902-7b01d6ca918f-config-volume\") pod \"collect-profiles-29495025-7dzd5\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.423541 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97b9b728-5c14-478a-a902-7b01d6ca918f-secret-volume\") pod \"collect-profiles-29495025-7dzd5\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.434959 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwm7z\" (UniqueName: \"kubernetes.io/projected/97b9b728-5c14-478a-a902-7b01d6ca918f-kube-api-access-bwm7z\") pod \"collect-profiles-29495025-7dzd5\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: I0129 15:45:00.515377 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:00 crc kubenswrapper[4753]: W0129 15:45:00.999601 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b9b728_5c14_478a_a902_7b01d6ca918f.slice/crio-5183b260c4ed1a0706fff18a09517ad7a8b854cf8bc6e9402a6c29b5e7dd9c5a WatchSource:0}: Error finding container 5183b260c4ed1a0706fff18a09517ad7a8b854cf8bc6e9402a6c29b5e7dd9c5a: Status 404 returned error can't find the container with id 5183b260c4ed1a0706fff18a09517ad7a8b854cf8bc6e9402a6c29b5e7dd9c5a Jan 29 15:45:01 crc kubenswrapper[4753]: I0129 15:45:01.006848 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5"] Jan 29 15:45:01 crc kubenswrapper[4753]: I0129 15:45:01.605598 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" event={"ID":"97b9b728-5c14-478a-a902-7b01d6ca918f","Type":"ContainerStarted","Data":"ffbfb31e6df15fa9a7a7a66357dd38cec9c29fe1f3d6c1344a5c162e71ad7ecd"} Jan 29 15:45:01 crc kubenswrapper[4753]: I0129 15:45:01.605869 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" event={"ID":"97b9b728-5c14-478a-a902-7b01d6ca918f","Type":"ContainerStarted","Data":"5183b260c4ed1a0706fff18a09517ad7a8b854cf8bc6e9402a6c29b5e7dd9c5a"} Jan 29 15:45:02 crc kubenswrapper[4753]: I0129 15:45:02.614102 4753 generic.go:334] "Generic (PLEG): container finished" podID="97b9b728-5c14-478a-a902-7b01d6ca918f" containerID="ffbfb31e6df15fa9a7a7a66357dd38cec9c29fe1f3d6c1344a5c162e71ad7ecd" exitCode=0 Jan 29 15:45:02 crc kubenswrapper[4753]: I0129 15:45:02.614180 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" event={"ID":"97b9b728-5c14-478a-a902-7b01d6ca918f","Type":"ContainerDied","Data":"ffbfb31e6df15fa9a7a7a66357dd38cec9c29fe1f3d6c1344a5c162e71ad7ecd"} Jan 29 15:45:03 crc kubenswrapper[4753]: I0129 15:45:03.963581 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.097399 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97b9b728-5c14-478a-a902-7b01d6ca918f-secret-volume\") pod \"97b9b728-5c14-478a-a902-7b01d6ca918f\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.097505 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97b9b728-5c14-478a-a902-7b01d6ca918f-config-volume\") pod \"97b9b728-5c14-478a-a902-7b01d6ca918f\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.097559 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwm7z\" (UniqueName: \"kubernetes.io/projected/97b9b728-5c14-478a-a902-7b01d6ca918f-kube-api-access-bwm7z\") pod \"97b9b728-5c14-478a-a902-7b01d6ca918f\" (UID: \"97b9b728-5c14-478a-a902-7b01d6ca918f\") " Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.097954 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b9b728-5c14-478a-a902-7b01d6ca918f-config-volume" (OuterVolumeSpecName: "config-volume") pod "97b9b728-5c14-478a-a902-7b01d6ca918f" (UID: "97b9b728-5c14-478a-a902-7b01d6ca918f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.098415 4753 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97b9b728-5c14-478a-a902-7b01d6ca918f-config-volume\") on node \"crc\" DevicePath \"\"" Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.102696 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b9b728-5c14-478a-a902-7b01d6ca918f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97b9b728-5c14-478a-a902-7b01d6ca918f" (UID: "97b9b728-5c14-478a-a902-7b01d6ca918f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.102791 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b9b728-5c14-478a-a902-7b01d6ca918f-kube-api-access-bwm7z" (OuterVolumeSpecName: "kube-api-access-bwm7z") pod "97b9b728-5c14-478a-a902-7b01d6ca918f" (UID: "97b9b728-5c14-478a-a902-7b01d6ca918f"). InnerVolumeSpecName "kube-api-access-bwm7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.202229 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwm7z\" (UniqueName: \"kubernetes.io/projected/97b9b728-5c14-478a-a902-7b01d6ca918f-kube-api-access-bwm7z\") on node \"crc\" DevicePath \"\"" Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.202263 4753 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97b9b728-5c14-478a-a902-7b01d6ca918f-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.632888 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" event={"ID":"97b9b728-5c14-478a-a902-7b01d6ca918f","Type":"ContainerDied","Data":"5183b260c4ed1a0706fff18a09517ad7a8b854cf8bc6e9402a6c29b5e7dd9c5a"} Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.632930 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29495025-7dzd5" Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.632933 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5183b260c4ed1a0706fff18a09517ad7a8b854cf8bc6e9402a6c29b5e7dd9c5a" Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.687074 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg"] Jan 29 15:45:04 crc kubenswrapper[4753]: I0129 15:45:04.743748 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29494980-bwkhg"] Jan 29 15:45:06 crc kubenswrapper[4753]: I0129 15:45:06.161095 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bdf128f-e268-4db9-8d94-73ff14869f6f" path="/var/lib/kubelet/pods/4bdf128f-e268-4db9-8d94-73ff14869f6f/volumes" Jan 29 15:45:06 crc kubenswrapper[4753]: E0129 15:45:06.295820 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 15:45:06 crc kubenswrapper[4753]: E0129 15:45:06.295969 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26zht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xgt6l_openshift-marketplace(61359d71-6838-4f79-8df5-3bf8b620f0a6): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:45:06 crc kubenswrapper[4753]: E0129 15:45:06.297512 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:45:07 crc kubenswrapper[4753]: E0129 15:45:07.153040 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:45:12 crc kubenswrapper[4753]: I0129 15:45:12.814683 4753 scope.go:117] "RemoveContainer" containerID="516b3d758bf8f47b9acb854501c20565ce10a1920afc00e60bb79fe2e2de0fe1" Jan 29 15:45:12 crc kubenswrapper[4753]: I0129 15:45:12.861286 4753 scope.go:117] "RemoveContainer" containerID="d846a7206d4ce7a763a19fcd1e81d0391536a4b1a2f7ff3f08eb9475b5e2ce5e" Jan 29 15:45:12 crc kubenswrapper[4753]: I0129 15:45:12.947747 4753 scope.go:117] "RemoveContainer" containerID="45f3369822e9d27958bbe7571a9cd1bb7d754295cdeb7ea6d115b8d7b110b73d" Jan 29 15:45:12 crc kubenswrapper[4753]: I0129 15:45:12.974358 4753 scope.go:117] "RemoveContainer" containerID="6fdb622d7cac9c4eaa8e3ffb1f44bffc115ed0d4186720085dff27d18687c99a" Jan 29 15:45:13 crc kubenswrapper[4753]: I0129 15:45:13.013284 4753 scope.go:117] "RemoveContainer" containerID="487fcdc593a8ac55d5d3583e9aaac21ca9431766bd0a898764b8a281a5f76e7d" Jan 29 15:45:13 crc kubenswrapper[4753]: I0129 
15:45:13.091227 4753 scope.go:117] "RemoveContainer" containerID="3452c8832f0b8b0f8cf58d57bac5c5cf11c9bf27006d1a303881002226f68c73" Jan 29 15:45:18 crc kubenswrapper[4753]: E0129 15:45:18.150685 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:45:19 crc kubenswrapper[4753]: E0129 15:45:19.157416 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:45:27 crc kubenswrapper[4753]: I0129 15:45:27.055431 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:45:27 crc kubenswrapper[4753]: I0129 15:45:27.056524 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:45:31 crc kubenswrapper[4753]: E0129 15:45:31.151973 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:45:33 crc kubenswrapper[4753]: I0129 15:45:33.052138 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6zxpq"] Jan 29 15:45:33 crc kubenswrapper[4753]: I0129 15:45:33.060818 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6zxpq"] Jan 29 15:45:33 crc kubenswrapper[4753]: I0129 15:45:33.071358 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-53dd-account-create-update-zlbhj"] Jan 29 15:45:33 crc kubenswrapper[4753]: I0129 15:45:33.079356 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-53dd-account-create-update-zlbhj"] Jan 29 15:45:34 crc kubenswrapper[4753]: E0129 15:45:34.151759 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:45:34 crc kubenswrapper[4753]: I0129 15:45:34.174997 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50dc7afb-2680-43db-88e4-8b5315ee34c4" 
path="/var/lib/kubelet/pods/50dc7afb-2680-43db-88e4-8b5315ee34c4/volumes" Jan 29 15:45:34 crc kubenswrapper[4753]: I0129 15:45:34.175720 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3129f07-ee9a-402f-97f1-8a8c3093b3ae" path="/var/lib/kubelet/pods/f3129f07-ee9a-402f-97f1-8a8c3093b3ae/volumes" Jan 29 15:45:41 crc kubenswrapper[4753]: I0129 15:45:41.042265 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-cdd6x"] Jan 29 15:45:41 crc kubenswrapper[4753]: I0129 15:45:41.054439 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-cdd6x"] Jan 29 15:45:42 crc kubenswrapper[4753]: I0129 15:45:42.161371 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf" path="/var/lib/kubelet/pods/fe507c25-d7c5-418f-8fb5-b1d4d00b1bbf/volumes" Jan 29 15:45:43 crc kubenswrapper[4753]: E0129 15:45:43.152943 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:45:48 crc kubenswrapper[4753]: E0129 15:45:48.280795 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 29 15:45:48 crc kubenswrapper[4753]: E0129 15:45:48.281495 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26zht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-xgt6l_openshift-marketplace(61359d71-6838-4f79-8df5-3bf8b620f0a6): ErrImagePull: initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:45:48 crc kubenswrapper[4753]: E0129 15:45:48.282717 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"initializing source docker://registry.redhat.io/redhat/community-operator-index:v4.18: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:45:57 crc kubenswrapper[4753]: I0129 15:45:57.055399 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:45:57 crc kubenswrapper[4753]: I0129 15:45:57.055974 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:45:58 crc kubenswrapper[4753]: E0129 15:45:58.153278 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:46:02 crc kubenswrapper[4753]: E0129 15:46:02.154646 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:46:07 crc kubenswrapper[4753]: I0129 15:46:07.764963 4753 patch_prober.go:28] interesting pod/console-5b5f5df768-q7mxm container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 29 15:46:07 crc kubenswrapper[4753]: I0129 15:46:07.765626 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-5b5f5df768-q7mxm" podUID="c0ac8a3c-f3d0-43e7-818b-7ad9fe76163d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.44:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 29 15:46:11 crc kubenswrapper[4753]: E0129 15:46:11.280230 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: 
Requesting bearer token: invalid status code from registry 403 (Forbidden)" image="registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6" Jan 29 15:46:11 crc kubenswrapper[4753]: E0129 15:46:11.281194 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgx5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_openshift-marketplace(ea6b2cb6-43c4-4b8d-ac86-c5522959a43b): ErrImagePull: initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)" logger="UnhandledError" Jan 29 15:46:11 crc kubenswrapper[4753]: E0129 15:46:11.282512 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"initializing source docker://registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6: Requesting bearer token: invalid status code from registry 403 (Forbidden)\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:46:13 crc kubenswrapper[4753]: I0129 15:46:13.282193 4753 scope.go:117] "RemoveContainer" containerID="e8f4f1e5862d855a9e7856d9ee77bc0b86f5442662aef00530c6df7b12717aa4" Jan 29 15:46:13 crc kubenswrapper[4753]: I0129 15:46:13.450897 4753 scope.go:117] "RemoveContainer" containerID="57c97e4fbc6500efa3262734e237de8aa934f2cf7da7e1285ee3f2add032876f" Jan 29 15:46:13 crc kubenswrapper[4753]: I0129 15:46:13.477063 4753 scope.go:117] "RemoveContainer" containerID="f3e96303b31f0a4eaf9543ecf4a2fbb463326996b9ef00debfffac3c9003316a" Jan 29 15:46:16 crc kubenswrapper[4753]: E0129 
15:46:16.159008 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:46:25 crc kubenswrapper[4753]: E0129 15:46:25.152043 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:46:27 crc kubenswrapper[4753]: I0129 15:46:27.054666 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:46:27 crc kubenswrapper[4753]: I0129 15:46:27.055040 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:46:27 crc kubenswrapper[4753]: I0129 15:46:27.055092 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 15:46:27 crc kubenswrapper[4753]: I0129 15:46:27.055913 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36ed70dab0ac14e2bbb32c7b086f3b2d4999c7c2d1318f4807ee75319489be51"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 15:46:27 crc kubenswrapper[4753]: I0129 15:46:27.055973 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://36ed70dab0ac14e2bbb32c7b086f3b2d4999c7c2d1318f4807ee75319489be51" gracePeriod=600 Jan 29 15:46:27 crc kubenswrapper[4753]: I0129 15:46:27.417046 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="36ed70dab0ac14e2bbb32c7b086f3b2d4999c7c2d1318f4807ee75319489be51" exitCode=0 Jan 29 15:46:27 crc kubenswrapper[4753]: I0129 15:46:27.417117 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"36ed70dab0ac14e2bbb32c7b086f3b2d4999c7c2d1318f4807ee75319489be51"} Jan 29 15:46:27 crc kubenswrapper[4753]: I0129 15:46:27.417764 4753 scope.go:117] "RemoveContainer" containerID="417d8138a41abd2154db6ca7106bd57bb8a091ef28c83fc74f84b6800a3c373e" Jan 29 15:46:28 crc kubenswrapper[4753]: E0129 15:46:28.151836 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:46:28 crc kubenswrapper[4753]: I0129 15:46:28.430975 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e"} Jan 29 15:46:39 crc kubenswrapper[4753]: E0129 15:46:39.153469 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:46:40 crc kubenswrapper[4753]: E0129 15:46:40.151101 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:46:52 crc kubenswrapper[4753]: E0129 15:46:52.152394 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:46:53 crc kubenswrapper[4753]: E0129 15:46:53.151492 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:47:04 crc kubenswrapper[4753]: E0129 15:47:04.153323 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" Jan 29 15:47:05 crc kubenswrapper[4753]: E0129 15:47:05.151176 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:47:18 crc kubenswrapper[4753]: E0129 15:47:18.505877 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling 
image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:47:19 crc kubenswrapper[4753]: I0129 15:47:19.877338 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt6l" event={"ID":"61359d71-6838-4f79-8df5-3bf8b620f0a6","Type":"ContainerStarted","Data":"b1d0fb5c38b794fc2c20e927b02f4b0eeb09a5466c2bcf959c662ea1168f79d4"} Jan 29 15:47:20 crc kubenswrapper[4753]: I0129 15:47:20.888121 4753 generic.go:334] "Generic (PLEG): container finished" podID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerID="b1d0fb5c38b794fc2c20e927b02f4b0eeb09a5466c2bcf959c662ea1168f79d4" exitCode=0 Jan 29 15:47:20 crc kubenswrapper[4753]: I0129 15:47:20.888213 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt6l" event={"ID":"61359d71-6838-4f79-8df5-3bf8b620f0a6","Type":"ContainerDied","Data":"b1d0fb5c38b794fc2c20e927b02f4b0eeb09a5466c2bcf959c662ea1168f79d4"} Jan 29 15:47:23 crc kubenswrapper[4753]: I0129 15:47:23.921249 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt6l" event={"ID":"61359d71-6838-4f79-8df5-3bf8b620f0a6","Type":"ContainerStarted","Data":"396c550407c3a20f76a3ba383bda2f5025f2c2d0e1574d5c7be22fb9e6f7583c"} Jan 29 15:47:23 crc kubenswrapper[4753]: I0129 15:47:23.939876 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xgt6l" podStartSLOduration=4.045224101 podStartE2EDuration="3m5.939860553s" podCreationTimestamp="2026-01-29 15:44:18 +0000 UTC" firstStartedPulling="2026-01-29 15:44:21.262787575 +0000 UTC m=+6095.957521957" lastFinishedPulling="2026-01-29 15:47:23.157424027 +0000 UTC m=+6277.852158409" observedRunningTime="2026-01-29 15:47:23.937927441 +0000 UTC m=+6278.632661853" watchObservedRunningTime="2026-01-29 15:47:23.939860553 +0000 UTC m=+6278.634594935" Jan 29 15:47:29 crc kubenswrapper[4753]: I0129 15:47:29.250836 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:47:29 crc kubenswrapper[4753]: I0129 15:47:29.252647 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:47:29 crc kubenswrapper[4753]: I0129 15:47:29.306675 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:47:30 crc kubenswrapper[4753]: I0129 15:47:30.010318 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:47:30 crc kubenswrapper[4753]: I0129 15:47:30.063213 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgt6l"] Jan 29 15:47:31 crc kubenswrapper[4753]: E0129 15:47:31.151843 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:47:31 crc kubenswrapper[4753]: I0129 15:47:31.985911 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xgt6l" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerName="registry-server" containerID="cri-o://396c550407c3a20f76a3ba383bda2f5025f2c2d0e1574d5c7be22fb9e6f7583c" gracePeriod=2 Jan 29 15:47:32 crc kubenswrapper[4753]: I0129 15:47:32.997820 4753 generic.go:334] "Generic (PLEG): container finished" podID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerID="396c550407c3a20f76a3ba383bda2f5025f2c2d0e1574d5c7be22fb9e6f7583c" exitCode=0 Jan 29 15:47:32 crc kubenswrapper[4753]: I0129 15:47:32.997911 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt6l" event={"ID":"61359d71-6838-4f79-8df5-3bf8b620f0a6","Type":"ContainerDied","Data":"396c550407c3a20f76a3ba383bda2f5025f2c2d0e1574d5c7be22fb9e6f7583c"} Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.135982 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.274846 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-utilities\") pod \"61359d71-6838-4f79-8df5-3bf8b620f0a6\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.274965 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-catalog-content\") pod \"61359d71-6838-4f79-8df5-3bf8b620f0a6\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.275052 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26zht\" (UniqueName: \"kubernetes.io/projected/61359d71-6838-4f79-8df5-3bf8b620f0a6-kube-api-access-26zht\") pod \"61359d71-6838-4f79-8df5-3bf8b620f0a6\" (UID: \"61359d71-6838-4f79-8df5-3bf8b620f0a6\") " Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.275987 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-utilities" (OuterVolumeSpecName: "utilities") pod "61359d71-6838-4f79-8df5-3bf8b620f0a6" (UID: "61359d71-6838-4f79-8df5-3bf8b620f0a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.280931 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61359d71-6838-4f79-8df5-3bf8b620f0a6-kube-api-access-26zht" (OuterVolumeSpecName: "kube-api-access-26zht") pod "61359d71-6838-4f79-8df5-3bf8b620f0a6" (UID: "61359d71-6838-4f79-8df5-3bf8b620f0a6"). InnerVolumeSpecName "kube-api-access-26zht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.350872 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61359d71-6838-4f79-8df5-3bf8b620f0a6" (UID: "61359d71-6838-4f79-8df5-3bf8b620f0a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.378761 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.378796 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61359d71-6838-4f79-8df5-3bf8b620f0a6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:47:33 crc kubenswrapper[4753]: I0129 15:47:33.378811 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26zht\" (UniqueName: \"kubernetes.io/projected/61359d71-6838-4f79-8df5-3bf8b620f0a6-kube-api-access-26zht\") on node \"crc\" DevicePath \"\"" Jan 29 15:47:34 crc kubenswrapper[4753]: I0129 15:47:34.013289 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt6l" event={"ID":"61359d71-6838-4f79-8df5-3bf8b620f0a6","Type":"ContainerDied","Data":"e65ba39e0606b6ce3e0c4f4e6113b81b537882ab8e6e9789d726efa1c7118b77"} Jan 29 15:47:34 crc kubenswrapper[4753]: I0129 15:47:34.013672 4753 scope.go:117] "RemoveContainer" containerID="396c550407c3a20f76a3ba383bda2f5025f2c2d0e1574d5c7be22fb9e6f7583c" Jan 29 15:47:34 crc kubenswrapper[4753]: I0129 15:47:34.013921 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xgt6l" Jan 29 15:47:34 crc kubenswrapper[4753]: I0129 15:47:34.042855 4753 scope.go:117] "RemoveContainer" containerID="b1d0fb5c38b794fc2c20e927b02f4b0eeb09a5466c2bcf959c662ea1168f79d4" Jan 29 15:47:34 crc kubenswrapper[4753]: I0129 15:47:34.053074 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgt6l"] Jan 29 15:47:34 crc kubenswrapper[4753]: I0129 15:47:34.064481 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xgt6l"] Jan 29 15:47:34 crc kubenswrapper[4753]: I0129 15:47:34.070255 4753 scope.go:117] "RemoveContainer" containerID="aaef84bcdd34f623978253790ddac6bc188597ebb67728278b58266f759a5653" Jan 29 15:47:34 crc kubenswrapper[4753]: I0129 15:47:34.166302 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" path="/var/lib/kubelet/pods/61359d71-6838-4f79-8df5-3bf8b620f0a6/volumes" Jan 29 15:47:45 crc kubenswrapper[4753]: E0129 15:47:45.151322 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:47:58 crc kubenswrapper[4753]: E0129 15:47:58.151979 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:48:10 crc kubenswrapper[4753]: E0129 15:48:10.150951 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.066829 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-wmq68"] Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.078380 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-wmq68"] Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.186514 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f" path="/var/lib/kubelet/pods/ce7b0f5a-10b7-49e6-9dcc-c959ef4b698f/volumes" Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.683641 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r5gql"] Jan 29 15:48:22 crc kubenswrapper[4753]: E0129 15:48:22.684131 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerName="extract-utilities" Jan 29 
15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.684171 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerName="extract-utilities" Jan 29 15:48:22 crc kubenswrapper[4753]: E0129 15:48:22.684194 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerName="registry-server" Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.684204 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerName="registry-server" Jan 29 15:48:22 crc kubenswrapper[4753]: E0129 15:48:22.684226 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerName="extract-content" Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.684235 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerName="extract-content" Jan 29 15:48:22 crc kubenswrapper[4753]: E0129 15:48:22.684258 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b9b728-5c14-478a-a902-7b01d6ca918f" containerName="collect-profiles" Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.684268 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b9b728-5c14-478a-a902-7b01d6ca918f" containerName="collect-profiles" Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.684505 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="61359d71-6838-4f79-8df5-3bf8b620f0a6" containerName="registry-server" Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.684523 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b9b728-5c14-478a-a902-7b01d6ca918f" containerName="collect-profiles" Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.687182 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.701952 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r5gql"]
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.718245 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-catalog-content\") pod \"certified-operators-r5gql\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") " pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.718330 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9tm\" (UniqueName: \"kubernetes.io/projected/ece32461-ebe5-4402-ad18-891f5bab74c6-kube-api-access-hc9tm\") pod \"certified-operators-r5gql\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") " pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.718382 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-utilities\") pod \"certified-operators-r5gql\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") " pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.820778 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-catalog-content\") pod \"certified-operators-r5gql\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") " pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.821218 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9tm\" (UniqueName: \"kubernetes.io/projected/ece32461-ebe5-4402-ad18-891f5bab74c6-kube-api-access-hc9tm\") pod \"certified-operators-r5gql\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") " pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.821297 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-utilities\") pod \"certified-operators-r5gql\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") " pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.821362 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-catalog-content\") pod \"certified-operators-r5gql\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") " pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.821691 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-utilities\") pod \"certified-operators-r5gql\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") " pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:22 crc kubenswrapper[4753]: I0129 15:48:22.846570 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9tm\" (UniqueName: \"kubernetes.io/projected/ece32461-ebe5-4402-ad18-891f5bab74c6-kube-api-access-hc9tm\") pod \"certified-operators-r5gql\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") " pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:23 crc kubenswrapper[4753]: I0129 15:48:23.005400 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:23 crc kubenswrapper[4753]: I0129 15:48:23.569254 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r5gql"]
Jan 29 15:48:24 crc kubenswrapper[4753]: I0129 15:48:24.026680 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-7d0c-account-create-update-qfdv9"]
Jan 29 15:48:24 crc kubenswrapper[4753]: I0129 15:48:24.034500 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-7d0c-account-create-update-qfdv9"]
Jan 29 15:48:24 crc kubenswrapper[4753]: E0129 15:48:24.152248 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b"
Jan 29 15:48:24 crc kubenswrapper[4753]: I0129 15:48:24.163926 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8606fff5-4c4f-4f5f-9471-88e3b77284da" path="/var/lib/kubelet/pods/8606fff5-4c4f-4f5f-9471-88e3b77284da/volumes"
Jan 29 15:48:24 crc kubenswrapper[4753]: I0129 15:48:24.479583 4753 generic.go:334] "Generic (PLEG): container finished" podID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerID="a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9" exitCode=0
Jan 29 15:48:24 crc kubenswrapper[4753]: I0129 15:48:24.479631 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gql" event={"ID":"ece32461-ebe5-4402-ad18-891f5bab74c6","Type":"ContainerDied","Data":"a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9"}
Jan 29 15:48:24 crc kubenswrapper[4753]: I0129 15:48:24.479661 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gql" event={"ID":"ece32461-ebe5-4402-ad18-891f5bab74c6","Type":"ContainerStarted","Data":"1d5332047a92d6ef59700f2039c40ef6d14836523ad3a908d4873a9b32078d46"}
Jan 29 15:48:25 crc kubenswrapper[4753]: I0129 15:48:25.888964 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ggcch"]
Jan 29 15:48:25 crc kubenswrapper[4753]: I0129 15:48:25.893821 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:25 crc kubenswrapper[4753]: I0129 15:48:25.949142 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggcch"]
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.004862 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnn2\" (UniqueName: \"kubernetes.io/projected/da2af3b7-f304-4576-b4c6-1ffaba5b4278-kube-api-access-nfnn2\") pod \"redhat-operators-ggcch\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") " pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.004939 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-utilities\") pod \"redhat-operators-ggcch\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") " pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.005037 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-catalog-content\") pod \"redhat-operators-ggcch\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") " pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.107322 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnn2\" (UniqueName: \"kubernetes.io/projected/da2af3b7-f304-4576-b4c6-1ffaba5b4278-kube-api-access-nfnn2\") pod \"redhat-operators-ggcch\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") " pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.107378 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-utilities\") pod \"redhat-operators-ggcch\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") " pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.107462 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-catalog-content\") pod \"redhat-operators-ggcch\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") " pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.108118 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-catalog-content\") pod \"redhat-operators-ggcch\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") " pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.108115 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-utilities\") pod \"redhat-operators-ggcch\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") " pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.132478 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnn2\" (UniqueName: \"kubernetes.io/projected/da2af3b7-f304-4576-b4c6-1ffaba5b4278-kube-api-access-nfnn2\") pod \"redhat-operators-ggcch\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") " pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.263412 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.522256 4753 generic.go:334] "Generic (PLEG): container finished" podID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerID="52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0" exitCode=0
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.522419 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gql" event={"ID":"ece32461-ebe5-4402-ad18-891f5bab74c6","Type":"ContainerDied","Data":"52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0"}
Jan 29 15:48:26 crc kubenswrapper[4753]: I0129 15:48:26.761323 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ggcch"]
Jan 29 15:48:27 crc kubenswrapper[4753]: I0129 15:48:27.057641 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 15:48:27 crc kubenswrapper[4753]: I0129 15:48:27.057974 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 15:48:27 crc kubenswrapper[4753]: I0129 15:48:27.534255 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gql" event={"ID":"ece32461-ebe5-4402-ad18-891f5bab74c6","Type":"ContainerStarted","Data":"9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1"}
Jan 29 15:48:27 crc kubenswrapper[4753]: I0129 15:48:27.535636 4753 generic.go:334] "Generic (PLEG): container finished" podID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerID="85215c46458a812dac16706990d05ca42d06ff7309e5a07bbc984a3fa28452d7" exitCode=0
Jan 29 15:48:27 crc kubenswrapper[4753]: I0129 15:48:27.535672 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggcch" event={"ID":"da2af3b7-f304-4576-b4c6-1ffaba5b4278","Type":"ContainerDied","Data":"85215c46458a812dac16706990d05ca42d06ff7309e5a07bbc984a3fa28452d7"}
Jan 29 15:48:27 crc kubenswrapper[4753]: I0129 15:48:27.535691 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggcch" event={"ID":"da2af3b7-f304-4576-b4c6-1ffaba5b4278","Type":"ContainerStarted","Data":"4e51d0a8dacbd9733cbc9926fbbc36d6e8f6ecb6b443d80bc5bca4237708d1a8"}
Jan 29 15:48:27 crc kubenswrapper[4753]: I0129 15:48:27.562885 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r5gql" podStartSLOduration=3.124296967 podStartE2EDuration="5.562858765s" podCreationTimestamp="2026-01-29 15:48:22 +0000 UTC" firstStartedPulling="2026-01-29 15:48:24.481244211 +0000 UTC m=+6339.175978593" lastFinishedPulling="2026-01-29 15:48:26.919806009 +0000 UTC m=+6341.614540391" observedRunningTime="2026-01-29 15:48:27.556714068 +0000 UTC m=+6342.251448470" watchObservedRunningTime="2026-01-29 15:48:27.562858765 +0000 UTC m=+6342.257593147"
Jan 29 15:48:29 crc kubenswrapper[4753]: I0129 15:48:29.551854 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggcch" event={"ID":"da2af3b7-f304-4576-b4c6-1ffaba5b4278","Type":"ContainerStarted","Data":"ffff470bf90e6ab1c4427b7c7787bf0c2b01ef052927ae444d6166de3ede347a"}
Jan 29 15:48:30 crc kubenswrapper[4753]: I0129 15:48:30.033100 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-5rwnd"]
Jan 29 15:48:30 crc kubenswrapper[4753]: I0129 15:48:30.047032 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-5rwnd"]
Jan 29 15:48:30 crc kubenswrapper[4753]: I0129 15:48:30.161904 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9de6cc8-8d76-4818-9ac7-5242037ba4dd" path="/var/lib/kubelet/pods/f9de6cc8-8d76-4818-9ac7-5242037ba4dd/volumes"
Jan 29 15:48:30 crc kubenswrapper[4753]: I0129 15:48:30.564463 4753 generic.go:334] "Generic (PLEG): container finished" podID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerID="ffff470bf90e6ab1c4427b7c7787bf0c2b01ef052927ae444d6166de3ede347a" exitCode=0
Jan 29 15:48:30 crc kubenswrapper[4753]: I0129 15:48:30.564513 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggcch" event={"ID":"da2af3b7-f304-4576-b4c6-1ffaba5b4278","Type":"ContainerDied","Data":"ffff470bf90e6ab1c4427b7c7787bf0c2b01ef052927ae444d6166de3ede347a"}
Jan 29 15:48:31 crc kubenswrapper[4753]: I0129 15:48:31.031646 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-41eb-account-create-update-mq7gf"]
Jan 29 15:48:31 crc kubenswrapper[4753]: I0129 15:48:31.040815 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-41eb-account-create-update-mq7gf"]
Jan 29 15:48:31 crc kubenswrapper[4753]: I0129 15:48:31.574907 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggcch" event={"ID":"da2af3b7-f304-4576-b4c6-1ffaba5b4278","Type":"ContainerStarted","Data":"a2f9c69f1aedd55a35a0724dfa115e5098143341d0967f2a5dbfb71aed455ecf"}
Jan 29 15:48:31 crc kubenswrapper[4753]: I0129 15:48:31.601039 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ggcch" podStartSLOduration=3.040958339 podStartE2EDuration="6.601019756s" podCreationTimestamp="2026-01-29 15:48:25 +0000 UTC" firstStartedPulling="2026-01-29 15:48:27.537462638 +0000 UTC m=+6342.232197020" lastFinishedPulling="2026-01-29 15:48:31.097524065 +0000 UTC m=+6345.792258437" observedRunningTime="2026-01-29 15:48:31.592600578 +0000 UTC m=+6346.287334970" watchObservedRunningTime="2026-01-29 15:48:31.601019756 +0000 UTC m=+6346.295754128"
Jan 29 15:48:32 crc kubenswrapper[4753]: I0129 15:48:32.160687 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156f1d4c-b1f1-4c66-9425-65a6819d5efb" path="/var/lib/kubelet/pods/156f1d4c-b1f1-4c66-9425-65a6819d5efb/volumes"
Jan 29 15:48:33 crc kubenswrapper[4753]: I0129 15:48:33.005850 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:33 crc kubenswrapper[4753]: I0129 15:48:33.007166 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:33 crc kubenswrapper[4753]: I0129 15:48:33.055084 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:33 crc kubenswrapper[4753]: I0129 15:48:33.635059 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:34 crc kubenswrapper[4753]: I0129 15:48:34.275609 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r5gql"]
Jan 29 15:48:35 crc kubenswrapper[4753]: I0129 15:48:35.607298 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r5gql" podUID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerName="registry-server" containerID="cri-o://9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1" gracePeriod=2
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.128457 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.223100 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc9tm\" (UniqueName: \"kubernetes.io/projected/ece32461-ebe5-4402-ad18-891f5bab74c6-kube-api-access-hc9tm\") pod \"ece32461-ebe5-4402-ad18-891f5bab74c6\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") "
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.223204 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-utilities\") pod \"ece32461-ebe5-4402-ad18-891f5bab74c6\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") "
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.223364 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-catalog-content\") pod \"ece32461-ebe5-4402-ad18-891f5bab74c6\" (UID: \"ece32461-ebe5-4402-ad18-891f5bab74c6\") "
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.223952 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-utilities" (OuterVolumeSpecName: "utilities") pod "ece32461-ebe5-4402-ad18-891f5bab74c6" (UID: "ece32461-ebe5-4402-ad18-891f5bab74c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.229055 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece32461-ebe5-4402-ad18-891f5bab74c6-kube-api-access-hc9tm" (OuterVolumeSpecName: "kube-api-access-hc9tm") pod "ece32461-ebe5-4402-ad18-891f5bab74c6" (UID: "ece32461-ebe5-4402-ad18-891f5bab74c6"). InnerVolumeSpecName "kube-api-access-hc9tm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.264243 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.264315 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.275927 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ece32461-ebe5-4402-ad18-891f5bab74c6" (UID: "ece32461-ebe5-4402-ad18-891f5bab74c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.326694 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc9tm\" (UniqueName: \"kubernetes.io/projected/ece32461-ebe5-4402-ad18-891f5bab74c6-kube-api-access-hc9tm\") on node \"crc\" DevicePath \"\""
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.326756 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.326768 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ece32461-ebe5-4402-ad18-891f5bab74c6-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.617644 4753 generic.go:334] "Generic (PLEG): container finished" podID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerID="9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1" exitCode=0
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.617937 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gql" event={"ID":"ece32461-ebe5-4402-ad18-891f5bab74c6","Type":"ContainerDied","Data":"9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1"}
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.617966 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r5gql" event={"ID":"ece32461-ebe5-4402-ad18-891f5bab74c6","Type":"ContainerDied","Data":"1d5332047a92d6ef59700f2039c40ef6d14836523ad3a908d4873a9b32078d46"}
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.617981 4753 scope.go:117] "RemoveContainer" containerID="9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.618116 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r5gql"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.662504 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r5gql"]
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.663540 4753 scope.go:117] "RemoveContainer" containerID="52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.673268 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r5gql"]
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.685758 4753 scope.go:117] "RemoveContainer" containerID="a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.747701 4753 scope.go:117] "RemoveContainer" containerID="9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1"
Jan 29 15:48:36 crc kubenswrapper[4753]: E0129 15:48:36.748664 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1\": container with ID starting with 9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1 not found: ID does not exist" containerID="9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.748711 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1"} err="failed to get container status \"9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1\": rpc error: code = NotFound desc = could not find container \"9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1\": container with ID starting with 9ddfead0c39fbec02ee47804b2994656f467f14c6ef873475b93a9baff2831a1 not found: ID does not exist"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.748743 4753 scope.go:117] "RemoveContainer" containerID="52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0"
Jan 29 15:48:36 crc kubenswrapper[4753]: E0129 15:48:36.749136 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0\": container with ID starting with 52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0 not found: ID does not exist" containerID="52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.749185 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0"} err="failed to get container status \"52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0\": rpc error: code = NotFound desc = could not find container \"52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0\": container with ID starting with 52d7926b6d56deb60d96cd0e5f1a76dbb190a5363361d486435ad104fead46e0 not found: ID does not exist"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.749207 4753 scope.go:117] "RemoveContainer" containerID="a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9"
Jan 29 15:48:36 crc kubenswrapper[4753]: E0129 15:48:36.749624 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9\": container with ID starting with a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9 not found: ID does not exist" containerID="a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9"
Jan 29 15:48:36 crc kubenswrapper[4753]: I0129 15:48:36.749670 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9"} err="failed to get container status \"a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9\": rpc error: code = NotFound desc = could not find container \"a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9\": container with ID starting with a799995a1f28f0f03a18c7079fffd77325f7712826f1cd41aa9d73f53de455c9 not found: ID does not exist"
Jan 29 15:48:37 crc kubenswrapper[4753]: I0129 15:48:37.314408 4753 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ggcch" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerName="registry-server" probeResult="failure" output=<
Jan 29 15:48:37 crc kubenswrapper[4753]: 	timeout: failed to connect service ":50051" within 1s
Jan 29 15:48:37 crc kubenswrapper[4753]:  >
Jan 29 15:48:38 crc kubenswrapper[4753]: I0129 15:48:38.163999 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece32461-ebe5-4402-ad18-891f5bab74c6" path="/var/lib/kubelet/pods/ece32461-ebe5-4402-ad18-891f5bab74c6/volumes"
Jan 29 15:48:39 crc kubenswrapper[4753]: E0129 15:48:39.151475 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-operator-bundle@sha256:a3b8e1f3f8d154095f365ccbb163f2cf3852d6091b1f74773a8b5a2ee5c1cee6\\\"\"" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b"
Jan 29 15:48:46 crc kubenswrapper[4753]: I0129 15:48:46.310930 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:46 crc kubenswrapper[4753]: I0129 15:48:46.361291 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:46 crc kubenswrapper[4753]: I0129 15:48:46.546914 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggcch"]
Jan 29 15:48:47 crc kubenswrapper[4753]: I0129 15:48:47.748926 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ggcch" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerName="registry-server" containerID="cri-o://a2f9c69f1aedd55a35a0724dfa115e5098143341d0967f2a5dbfb71aed455ecf" gracePeriod=2
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.762015 4753 generic.go:334] "Generic (PLEG): container finished" podID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerID="a2f9c69f1aedd55a35a0724dfa115e5098143341d0967f2a5dbfb71aed455ecf" exitCode=0
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.762090 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggcch" event={"ID":"da2af3b7-f304-4576-b4c6-1ffaba5b4278","Type":"ContainerDied","Data":"a2f9c69f1aedd55a35a0724dfa115e5098143341d0967f2a5dbfb71aed455ecf"}
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.762346 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ggcch" event={"ID":"da2af3b7-f304-4576-b4c6-1ffaba5b4278","Type":"ContainerDied","Data":"4e51d0a8dacbd9733cbc9926fbbc36d6e8f6ecb6b443d80bc5bca4237708d1a8"}
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.762361 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e51d0a8dacbd9733cbc9926fbbc36d6e8f6ecb6b443d80bc5bca4237708d1a8"
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.762127 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.923276 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-utilities\") pod \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") "
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.923353 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfnn2\" (UniqueName: \"kubernetes.io/projected/da2af3b7-f304-4576-b4c6-1ffaba5b4278-kube-api-access-nfnn2\") pod \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") "
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.923417 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-catalog-content\") pod \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\" (UID: \"da2af3b7-f304-4576-b4c6-1ffaba5b4278\") "
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.928731 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-utilities" (OuterVolumeSpecName: "utilities") pod "da2af3b7-f304-4576-b4c6-1ffaba5b4278" (UID: "da2af3b7-f304-4576-b4c6-1ffaba5b4278"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:48:48 crc kubenswrapper[4753]: I0129 15:48:48.930794 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2af3b7-f304-4576-b4c6-1ffaba5b4278-kube-api-access-nfnn2" (OuterVolumeSpecName: "kube-api-access-nfnn2") pod "da2af3b7-f304-4576-b4c6-1ffaba5b4278" (UID: "da2af3b7-f304-4576-b4c6-1ffaba5b4278"). InnerVolumeSpecName "kube-api-access-nfnn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:48:49 crc kubenswrapper[4753]: I0129 15:48:49.025806 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-utilities\") on node \"crc\" DevicePath \"\""
Jan 29 15:48:49 crc kubenswrapper[4753]: I0129 15:48:49.025854 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfnn2\" (UniqueName: \"kubernetes.io/projected/da2af3b7-f304-4576-b4c6-1ffaba5b4278-kube-api-access-nfnn2\") on node \"crc\" DevicePath \"\""
Jan 29 15:48:49 crc kubenswrapper[4753]: I0129 15:48:49.037639 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da2af3b7-f304-4576-b4c6-1ffaba5b4278" (UID: "da2af3b7-f304-4576-b4c6-1ffaba5b4278"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:48:49 crc kubenswrapper[4753]: I0129 15:48:49.127515 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da2af3b7-f304-4576-b4c6-1ffaba5b4278-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 29 15:48:49 crc kubenswrapper[4753]: I0129 15:48:49.771889 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ggcch"
Jan 29 15:48:49 crc kubenswrapper[4753]: I0129 15:48:49.806144 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ggcch"]
Jan 29 15:48:49 crc kubenswrapper[4753]: I0129 15:48:49.814119 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ggcch"]
Jan 29 15:48:50 crc kubenswrapper[4753]: I0129 15:48:50.187366 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" path="/var/lib/kubelet/pods/da2af3b7-f304-4576-b4c6-1ffaba5b4278/volumes"
Jan 29 15:48:56 crc kubenswrapper[4753]: I0129 15:48:56.838614 4753 generic.go:334] "Generic (PLEG): container finished" podID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerID="158f580b6e5edefc7e17c45d06d3de807c5865ce03a5871ef802de47c3e8484c" exitCode=0
Jan 29 15:48:56 crc kubenswrapper[4753]: I0129 15:48:56.838701 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" event={"ID":"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b","Type":"ContainerDied","Data":"158f580b6e5edefc7e17c45d06d3de807c5865ce03a5871ef802de47c3e8484c"}
Jan 29 15:48:57 crc kubenswrapper[4753]: I0129 15:48:57.055471 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 29 15:48:57 crc kubenswrapper[4753]: I0129 15:48:57.055918 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 29 15:48:57 crc kubenswrapper[4753]: I0129 15:48:57.852742 4753 generic.go:334] "Generic (PLEG): container finished" podID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerID="ded8aa09681cae136035e5c14bdc48bd44645ea616de7730aa3454e2e626db82" exitCode=0
Jan 29 15:48:57 crc kubenswrapper[4753]: I0129 15:48:57.852803 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" event={"ID":"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b","Type":"ContainerDied","Data":"ded8aa09681cae136035e5c14bdc48bd44645ea616de7730aa3454e2e626db82"}
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.297144 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.448832 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-bundle\") pod \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") "
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.449093 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-util\") pod \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") "
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.449179 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgx5m\" (UniqueName: \"kubernetes.io/projected/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-kube-api-access-rgx5m\") pod \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\" (UID: \"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b\") "
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.452909 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-bundle" (OuterVolumeSpecName: "bundle") pod "ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" (UID: "ea6b2cb6-43c4-4b8d-ac86-c5522959a43b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.457421 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-kube-api-access-rgx5m" (OuterVolumeSpecName: "kube-api-access-rgx5m") pod "ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" (UID: "ea6b2cb6-43c4-4b8d-ac86-c5522959a43b"). InnerVolumeSpecName "kube-api-access-rgx5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.466649 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-util" (OuterVolumeSpecName: "util") pod "ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" (UID: "ea6b2cb6-43c4-4b8d-ac86-c5522959a43b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.551416 4753 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-util\") on node \"crc\" DevicePath \"\""
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.551459 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgx5m\" (UniqueName: \"kubernetes.io/projected/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-kube-api-access-rgx5m\") on node \"crc\" DevicePath \"\""
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.551473 4753 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea6b2cb6-43c4-4b8d-ac86-c5522959a43b-bundle\") on node \"crc\" DevicePath \"\""
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.875016 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44" event={"ID":"ea6b2cb6-43c4-4b8d-ac86-c5522959a43b","Type":"ContainerDied","Data":"5e07e0608f52efebd3b8861e1317996f00d6679f71a69620ad0afc13f2c4ee2b"}
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.875073 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e07e0608f52efebd3b8861e1317996f00d6679f71a69620ad0afc13f2c4ee2b"
Jan 29 15:48:59 crc kubenswrapper[4753]: I0129 15:48:59.875106 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44"
Jan 29 15:49:10 crc kubenswrapper[4753]: I0129 15:49:10.093130 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-jc6gq"]
Jan 29 15:49:10 crc kubenswrapper[4753]: I0129 15:49:10.100538 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-jc6gq"]
Jan 29 15:49:10 crc kubenswrapper[4753]: I0129 15:49:10.160550 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782d2b35-9247-42b6-91b9-a935997474e7" path="/var/lib/kubelet/pods/782d2b35-9247-42b6-91b9-a935997474e7/volumes"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.789011 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj"]
Jan 29 15:49:12 crc kubenswrapper[4753]: E0129 15:49:12.790012 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerName="extract-utilities"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790031 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerName="extract-utilities"
Jan 29 15:49:12 crc kubenswrapper[4753]: E0129 15:49:12.790055 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerName="pull"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790063 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerName="pull"
Jan 29 15:49:12 crc kubenswrapper[4753]: E0129 15:49:12.790082 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerName="util"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790092 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerName="util"
Jan 29 15:49:12 crc kubenswrapper[4753]: E0129 15:49:12.790107 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerName="extract-utilities"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790115 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerName="extract-utilities"
Jan 29 15:49:12 crc kubenswrapper[4753]: E0129 15:49:12.790136 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerName="registry-server"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790144 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerName="registry-server"
Jan 29 15:49:12 crc kubenswrapper[4753]: E0129 15:49:12.790180 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerName="extract-content"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790189 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerName="extract-content"
Jan 29 15:49:12 crc kubenswrapper[4753]: E0129 15:49:12.790199 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerName="extract-content"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790205 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerName="extract-content"
Jan 29 15:49:12 crc kubenswrapper[4753]: E0129 15:49:12.790218 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerName="registry-server"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790225 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerName="registry-server"
Jan 29 15:49:12 crc kubenswrapper[4753]: E0129 15:49:12.790243 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerName="extract"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790250 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerName="extract"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790468 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece32461-ebe5-4402-ad18-891f5bab74c6" containerName="registry-server"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790492 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6b2cb6-43c4-4b8d-ac86-c5522959a43b" containerName="extract"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.790522 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2af3b7-f304-4576-b4c6-1ffaba5b4278" containerName="registry-server"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.791355 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.794651 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-8rmvh"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.794700 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.794818 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.803904 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj"]
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.886057 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"]
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.887762 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.891890 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.892760 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-2xbnf"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.908137 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"]
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.909799 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.918194 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"]
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.930267 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzqzc\" (UniqueName: \"kubernetes.io/projected/8d9e5081-ef9a-4c43-a32a-e4917c8c1db2-kube-api-access-tzqzc\") pod \"obo-prometheus-operator-68bc856cb9-k6ssj\" (UID: \"8d9e5081-ef9a-4c43-a32a-e4917c8c1db2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj"
Jan 29 15:49:12 crc kubenswrapper[4753]: I0129 15:49:12.958869 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"]
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.030395 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-j8wsv"]
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.032268 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-j8wsv"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.032496 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0b81ce9-61fe-40de-9647-d14c933f2b13-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-4cprd\" (UID: \"c0b81ce9-61fe-40de-9647-d14c933f2b13\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.032578 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/830e25f0-520a-446b-a145-5c1f0f3ceea1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-5b2cp\" (UID: \"830e25f0-520a-446b-a145-5c1f0f3ceea1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.032610 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0b81ce9-61fe-40de-9647-d14c933f2b13-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-4cprd\" (UID: \"c0b81ce9-61fe-40de-9647-d14c933f2b13\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.032644 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzqzc\" (UniqueName: \"kubernetes.io/projected/8d9e5081-ef9a-4c43-a32a-e4917c8c1db2-kube-api-access-tzqzc\") pod \"obo-prometheus-operator-68bc856cb9-k6ssj\" (UID: \"8d9e5081-ef9a-4c43-a32a-e4917c8c1db2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.032670 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/830e25f0-520a-446b-a145-5c1f0f3ceea1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-5b2cp\" (UID: \"830e25f0-520a-446b-a145-5c1f0f3ceea1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.035545 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.035760 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-98d67"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.046853 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-j8wsv"]
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.078475 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzqzc\" (UniqueName: \"kubernetes.io/projected/8d9e5081-ef9a-4c43-a32a-e4917c8c1db2-kube-api-access-tzqzc\") pod \"obo-prometheus-operator-68bc856cb9-k6ssj\" (UID: \"8d9e5081-ef9a-4c43-a32a-e4917c8c1db2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.113934 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.133999 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0b81ce9-61fe-40de-9647-d14c933f2b13-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-4cprd\" (UID: \"c0b81ce9-61fe-40de-9647-d14c933f2b13\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.134069 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/830e25f0-520a-446b-a145-5c1f0f3ceea1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-5b2cp\" (UID: \"830e25f0-520a-446b-a145-5c1f0f3ceea1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.134094 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0b81ce9-61fe-40de-9647-d14c933f2b13-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-4cprd\" (UID: \"c0b81ce9-61fe-40de-9647-d14c933f2b13\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.134118 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/830e25f0-520a-446b-a145-5c1f0f3ceea1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-5b2cp\" (UID: \"830e25f0-520a-446b-a145-5c1f0f3ceea1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.134145 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/45d80c47-de8f-426c-a6cf-c46adcf7394a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-j8wsv\" (UID: \"45d80c47-de8f-426c-a6cf-c46adcf7394a\") " pod="openshift-operators/observability-operator-59bdc8b94-j8wsv"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.134327 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4mpf\" (UniqueName: \"kubernetes.io/projected/45d80c47-de8f-426c-a6cf-c46adcf7394a-kube-api-access-v4mpf\") pod \"observability-operator-59bdc8b94-j8wsv\" (UID: \"45d80c47-de8f-426c-a6cf-c46adcf7394a\") " pod="openshift-operators/observability-operator-59bdc8b94-j8wsv"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.138272 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/830e25f0-520a-446b-a145-5c1f0f3ceea1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-5b2cp\" (UID: \"830e25f0-520a-446b-a145-5c1f0f3ceea1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.138415 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0b81ce9-61fe-40de-9647-d14c933f2b13-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-4cprd\" (UID: \"c0b81ce9-61fe-40de-9647-d14c933f2b13\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.139624 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0b81ce9-61fe-40de-9647-d14c933f2b13-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-4cprd\" (UID: \"c0b81ce9-61fe-40de-9647-d14c933f2b13\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.143617 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/830e25f0-520a-446b-a145-5c1f0f3ceea1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b499c759f-5b2cp\" (UID: \"830e25f0-520a-446b-a145-5c1f0f3ceea1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.211582 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.229659 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.234348 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dwlls"]
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.236339 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dwlls"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.236453 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/45d80c47-de8f-426c-a6cf-c46adcf7394a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-j8wsv\" (UID: \"45d80c47-de8f-426c-a6cf-c46adcf7394a\") " pod="openshift-operators/observability-operator-59bdc8b94-j8wsv"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.236677 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4mpf\" (UniqueName: \"kubernetes.io/projected/45d80c47-de8f-426c-a6cf-c46adcf7394a-kube-api-access-v4mpf\") pod \"observability-operator-59bdc8b94-j8wsv\" (UID: \"45d80c47-de8f-426c-a6cf-c46adcf7394a\") " pod="openshift-operators/observability-operator-59bdc8b94-j8wsv"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.241846 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-9rzfh"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.242399 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/45d80c47-de8f-426c-a6cf-c46adcf7394a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-j8wsv\" (UID: \"45d80c47-de8f-426c-a6cf-c46adcf7394a\") " pod="openshift-operators/observability-operator-59bdc8b94-j8wsv"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.267010 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dwlls"]
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.306605 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4mpf\" (UniqueName: \"kubernetes.io/projected/45d80c47-de8f-426c-a6cf-c46adcf7394a-kube-api-access-v4mpf\") pod \"observability-operator-59bdc8b94-j8wsv\" (UID: \"45d80c47-de8f-426c-a6cf-c46adcf7394a\") " pod="openshift-operators/observability-operator-59bdc8b94-j8wsv"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.340005 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgxgv\" (UniqueName: \"kubernetes.io/projected/785cd806-53b9-41c5-bde3-8445651ffff5-kube-api-access-vgxgv\") pod \"perses-operator-5bf474d74f-dwlls\" (UID: \"785cd806-53b9-41c5-bde3-8445651ffff5\") " pod="openshift-operators/perses-operator-5bf474d74f-dwlls"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.340292 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/785cd806-53b9-41c5-bde3-8445651ffff5-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dwlls\" (UID: \"785cd806-53b9-41c5-bde3-8445651ffff5\") " pod="openshift-operators/perses-operator-5bf474d74f-dwlls"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.366683 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-j8wsv"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.460305 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/785cd806-53b9-41c5-bde3-8445651ffff5-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dwlls\" (UID: \"785cd806-53b9-41c5-bde3-8445651ffff5\") " pod="openshift-operators/perses-operator-5bf474d74f-dwlls"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.460699 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgxgv\" (UniqueName: \"kubernetes.io/projected/785cd806-53b9-41c5-bde3-8445651ffff5-kube-api-access-vgxgv\") pod \"perses-operator-5bf474d74f-dwlls\" (UID: \"785cd806-53b9-41c5-bde3-8445651ffff5\") " pod="openshift-operators/perses-operator-5bf474d74f-dwlls"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.461894 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/785cd806-53b9-41c5-bde3-8445651ffff5-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dwlls\" (UID: \"785cd806-53b9-41c5-bde3-8445651ffff5\") " pod="openshift-operators/perses-operator-5bf474d74f-dwlls"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.500593 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgxgv\" (UniqueName: \"kubernetes.io/projected/785cd806-53b9-41c5-bde3-8445651ffff5-kube-api-access-vgxgv\") pod \"perses-operator-5bf474d74f-dwlls\" (UID: \"785cd806-53b9-41c5-bde3-8445651ffff5\") " pod="openshift-operators/perses-operator-5bf474d74f-dwlls"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.696724 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dwlls"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.704919 4753 scope.go:117] "RemoveContainer" containerID="21313d876224b59e882c7a5a5c997a5c1f413fa7b0bf73bb1300a2090b4343ea"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.751512 4753 scope.go:117] "RemoveContainer" containerID="bc3407d0a5c92f58438310b7f48aeacedd9b97b3c25829b886e6d324ee6bf3d4"
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.845446 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj"]
Jan 29 15:49:13 crc kubenswrapper[4753]: I0129 15:49:13.923621 4753 scope.go:117] "RemoveContainer" containerID="11ecf0c88f759c8d15f6d7bfd2818d985fd245da02f5fa9dcefaaace099b83a6"
Jan 29 15:49:14 crc kubenswrapper[4753]: I0129 15:49:14.036827 4753 scope.go:117] "RemoveContainer" containerID="07300aa973e7cedf0320cd4b96ddc63261f22a588c275ed9e0d41101076bfdc4"
Jan 29 15:49:14 crc kubenswrapper[4753]: I0129 15:49:14.134100 4753 scope.go:117] "RemoveContainer" containerID="8d455b5e8c4810211d797d03e1c8774ea06e385b76a85777d4260549aad7323b"
Jan 29 15:49:14 crc kubenswrapper[4753]: I0129 15:49:14.143835 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj" event={"ID":"8d9e5081-ef9a-4c43-a32a-e4917c8c1db2","Type":"ContainerStarted","Data":"a313c8f6596d6243bcb8ae3612a92dbee1b86a1394deb86e8a2c8bcb877b0484"}
Jan 29 15:49:14 crc kubenswrapper[4753]: I0129 15:49:14.220934 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd"]
Jan 29 15:49:14 crc kubenswrapper[4753]: I0129 15:49:14.245048 4753 scope.go:117] "RemoveContainer" containerID="d20fca9eb1965c2ee9d34c6086a3515cca60ee05516e5cdd03b22850d563cfed"
Jan 29 15:49:14 crc kubenswrapper[4753]: I0129 15:49:14.318476 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp"]
Jan 29 15:49:14 crc kubenswrapper[4753]: I0129 15:49:14.327859 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-j8wsv"]
Jan 29 15:49:14 crc kubenswrapper[4753]: I0129 15:49:14.602658 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dwlls"]
Jan 29 15:49:14 crc kubenswrapper[4753]: W0129 15:49:14.608865 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod785cd806_53b9_41c5_bde3_8445651ffff5.slice/crio-f47f5711e4380892d4d971a2581e0bf671419338a41c078ce2cc181e2ad7af4d WatchSource:0}: Error finding container f47f5711e4380892d4d971a2581e0bf671419338a41c078ce2cc181e2ad7af4d: Status 404 returned error can't find the container with id f47f5711e4380892d4d971a2581e0bf671419338a41c078ce2cc181e2ad7af4d
Jan 29 15:49:15 crc kubenswrapper[4753]: I0129 15:49:15.236047 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dwlls" event={"ID":"785cd806-53b9-41c5-bde3-8445651ffff5","Type":"ContainerStarted","Data":"f47f5711e4380892d4d971a2581e0bf671419338a41c078ce2cc181e2ad7af4d"}
Jan 29 15:49:15 crc kubenswrapper[4753]: I0129 15:49:15.243375 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd" event={"ID":"c0b81ce9-61fe-40de-9647-d14c933f2b13","Type":"ContainerStarted","Data":"9511c3e132fa296cc503091c55c5839c6e3ee55e564144b2d9d59d96ab238546"}
Jan 29 15:49:15 crc kubenswrapper[4753]: I0129 15:49:15.245658 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp" event={"ID":"830e25f0-520a-446b-a145-5c1f0f3ceea1","Type":"ContainerStarted","Data":"f66967486e10b2b4266f0f6eb63c4ee84ab948743cc635059ea1788467e5ba53"}
Jan 29 15:49:15 crc kubenswrapper[4753]: I0129 15:49:15.247692 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-j8wsv" event={"ID":"45d80c47-de8f-426c-a6cf-c46adcf7394a","Type":"ContainerStarted","Data":"f089d0d811cf0d28603f58008c9c4855dc8d104e729b289dcd184b6b4ed9c54a"}
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.447496 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5z222/must-gather-tl9tg"]
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.449677 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5z222/must-gather-tl9tg"
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.451357 4753 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5z222"/"default-dockercfg-7mnxl"
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.454835 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5z222"/"kube-root-ca.crt"
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.455202 4753 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5z222"/"openshift-service-ca.crt"
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.459981 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5z222/must-gather-tl9tg"]
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.632830 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-must-gather-output\") pod \"must-gather-tl9tg\" (UID: \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\") " pod="openshift-must-gather-5z222/must-gather-tl9tg"
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.632908 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qts6l\" (UniqueName: \"kubernetes.io/projected/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-kube-api-access-qts6l\") pod \"must-gather-tl9tg\" (UID: \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\") " pod="openshift-must-gather-5z222/must-gather-tl9tg"
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.734660 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-must-gather-output\") pod \"must-gather-tl9tg\" (UID: \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\") " pod="openshift-must-gather-5z222/must-gather-tl9tg"
Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.734829 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qts6l\" (UniqueName: \"kubernetes.io/projected/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-kube-api-access-qts6l\") pod \"must-gather-tl9tg\" (UID: \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\") " pod="openshift-must-gather-5z222/must-gather-tl9tg" Jan
29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.735646 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-must-gather-output\") pod \"must-gather-tl9tg\" (UID: \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\") " pod="openshift-must-gather-5z222/must-gather-tl9tg" Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.759183 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qts6l\" (UniqueName: \"kubernetes.io/projected/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-kube-api-access-qts6l\") pod \"must-gather-tl9tg\" (UID: \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\") " pod="openshift-must-gather-5z222/must-gather-tl9tg" Jan 29 15:49:22 crc kubenswrapper[4753]: I0129 15:49:22.772062 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5z222/must-gather-tl9tg" Jan 29 15:49:27 crc kubenswrapper[4753]: I0129 15:49:27.056605 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:49:27 crc kubenswrapper[4753]: I0129 15:49:27.057205 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:49:27 crc kubenswrapper[4753]: I0129 15:49:27.057265 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 15:49:27 crc kubenswrapper[4753]: I0129 15:49:27.058457 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 15:49:27 crc kubenswrapper[4753]: I0129 15:49:27.058525 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" gracePeriod=600 Jan 29 15:49:27 crc kubenswrapper[4753]: I0129 15:49:27.385097 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" exitCode=0 Jan 29 15:49:27 crc kubenswrapper[4753]: I0129 15:49:27.385275 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e"} Jan 29 15:49:27 crc kubenswrapper[4753]: I0129 15:49:27.385545 4753 scope.go:117] "RemoveContainer" containerID="36ed70dab0ac14e2bbb32c7b086f3b2d4999c7c2d1318f4807ee75319489be51" Jan 29 15:49:28 crc kubenswrapper[4753]: 
E0129 15:49:28.871311 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Jan 29 15:49:28 crc kubenswrapper[4753]: E0129 15:49:28.871743 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-b499c759f-5b2cp_openshift-operators(830e25f0-520a-446b-a145-5c1f0f3ceea1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 15:49:28 crc kubenswrapper[4753]: E0129 15:49:28.872827 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp" podUID="830e25f0-520a-446b-a145-5c1f0f3ceea1" Jan 29 15:49:28 crc kubenswrapper[4753]: E0129 15:49:28.912700 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" 
podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:49:28 crc kubenswrapper[4753]: E0129 15:49:28.930657 4753 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Jan 29 15:49:28 crc kubenswrapper[4753]: E0129 15:49:28.931138 4753 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-b499c759f-4cprd_openshift-operators(c0b81ce9-61fe-40de-9647-d14c933f2b13): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 29 15:49:28 crc kubenswrapper[4753]: E0129 15:49:28.932697 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd" podUID="c0b81ce9-61fe-40de-9647-d14c933f2b13" Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.409493 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dwlls" event={"ID":"785cd806-53b9-41c5-bde3-8445651ffff5","Type":"ContainerStarted","Data":"c406cfd6161c00eb5041e78a3db15f2e483682ecfcaf43dfd84c0dd03ffc6d5e"} Jan 29 15:49:29 crc kubenswrapper[4753]: 
I0129 15:49:29.410122 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-dwlls" Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.417693 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:49:29 crc kubenswrapper[4753]: E0129 15:49:29.418011 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.420766 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-j8wsv" event={"ID":"45d80c47-de8f-426c-a6cf-c46adcf7394a","Type":"ContainerStarted","Data":"b7ab01ca085a80a2262a4dbac75b7b8a8011ab4cacc44da9b80978684d399126"} Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.421224 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-j8wsv" Jan 29 15:49:29 crc kubenswrapper[4753]: E0129 15:49:29.421696 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp" podUID="830e25f0-520a-446b-a145-5c1f0f3ceea1" Jan 29 15:49:29 crc kubenswrapper[4753]: E0129 15:49:29.421977 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd" podUID="c0b81ce9-61fe-40de-9647-d14c933f2b13" Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.423381 4753 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-j8wsv container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.1.138:8081/healthz\": dial tcp 10.217.1.138:8081: connect: connection refused" start-of-body= Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.423413 4753 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-j8wsv" podUID="45d80c47-de8f-426c-a6cf-c46adcf7394a" containerName="operator" probeResult="failure" output="Get \"http://10.217.1.138:8081/healthz\": dial tcp 10.217.1.138:8081: connect: connection refused" Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.442856 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-dwlls" podStartSLOduration=2.098174687 podStartE2EDuration="16.442839763s" podCreationTimestamp="2026-01-29 15:49:13 +0000 UTC" firstStartedPulling="2026-01-29 
15:49:14.612874059 +0000 UTC m=+6389.307608441" lastFinishedPulling="2026-01-29 15:49:28.957539135 +0000 UTC m=+6403.652273517" observedRunningTime="2026-01-29 15:49:29.434522478 +0000 UTC m=+6404.129256860" watchObservedRunningTime="2026-01-29 15:49:29.442839763 +0000 UTC m=+6404.137574145" Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.493507 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-j8wsv" podStartSLOduration=1.841672799 podStartE2EDuration="16.493488143s" podCreationTimestamp="2026-01-29 15:49:13 +0000 UTC" firstStartedPulling="2026-01-29 15:49:14.340568563 +0000 UTC m=+6389.035302945" lastFinishedPulling="2026-01-29 15:49:28.992383907 +0000 UTC m=+6403.687118289" observedRunningTime="2026-01-29 15:49:29.490482041 +0000 UTC m=+6404.185216423" watchObservedRunningTime="2026-01-29 15:49:29.493488143 +0000 UTC m=+6404.188222525" Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.639352 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5z222/must-gather-tl9tg"] Jan 29 15:49:29 crc kubenswrapper[4753]: I0129 15:49:29.640962 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 15:49:30 crc kubenswrapper[4753]: I0129 15:49:30.432547 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5z222/must-gather-tl9tg" event={"ID":"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1","Type":"ContainerStarted","Data":"662fd341c7612ecb5e23e77f72c81667616f9a8cba02051cfe4a0be6f71d8716"} Jan 29 15:49:30 crc kubenswrapper[4753]: I0129 15:49:30.435233 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj" event={"ID":"8d9e5081-ef9a-4c43-a32a-e4917c8c1db2","Type":"ContainerStarted","Data":"40f44d959aeadaf35fd17ae6cccbc23b36db82a42fc5291f0836d193ed0b6150"} Jan 29 15:49:30 crc kubenswrapper[4753]: I0129 15:49:30.440319 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-j8wsv" Jan 29 15:49:30 crc kubenswrapper[4753]: I0129 15:49:30.480796 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-k6ssj" podStartSLOduration=3.395920193 podStartE2EDuration="18.480778651s" podCreationTimestamp="2026-01-29 15:49:12 +0000 UTC" firstStartedPulling="2026-01-29 15:49:13.860128137 +0000 UTC m=+6388.554862519" lastFinishedPulling="2026-01-29 15:49:28.944986595 +0000 UTC m=+6403.639720977" observedRunningTime="2026-01-29 15:49:30.47958822 +0000 UTC m=+6405.174322632" watchObservedRunningTime="2026-01-29 15:49:30.480778651 +0000 UTC m=+6405.175513033" Jan 29 15:49:41 crc kubenswrapper[4753]: I0129 15:49:41.150021 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:49:41 crc kubenswrapper[4753]: E0129 15:49:41.150893 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:49:41 crc kubenswrapper[4753]: I0129 15:49:41.561590 4753 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-5z222/must-gather-tl9tg" event={"ID":"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1","Type":"ContainerStarted","Data":"649861c985e6502475eae3a1faf0aa84f466446c36b9a4053e44f418cf0b8d79"} Jan 29 15:49:41 crc kubenswrapper[4753]: I0129 15:49:41.561975 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5z222/must-gather-tl9tg" event={"ID":"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1","Type":"ContainerStarted","Data":"904f62a3e8ed0487f213fe5c97d5a9b335be76524de2cb8e5216f153c44b4eba"} Jan 29 15:49:43 crc kubenswrapper[4753]: I0129 15:49:43.171193 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5z222/must-gather-tl9tg" podStartSLOduration=10.485269047 podStartE2EDuration="21.171171944s" podCreationTimestamp="2026-01-29 15:49:22 +0000 UTC" firstStartedPulling="2026-01-29 15:49:29.640589262 +0000 UTC m=+6404.335323644" lastFinishedPulling="2026-01-29 15:49:40.326492159 +0000 UTC m=+6415.021226541" observedRunningTime="2026-01-29 15:49:41.582117587 +0000 UTC m=+6416.276851969" watchObservedRunningTime="2026-01-29 15:49:43.171171944 +0000 UTC m=+6417.865906326" Jan 29 15:49:43 crc kubenswrapper[4753]: I0129 15:49:43.700735 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-dwlls" Jan 29 15:49:44 crc kubenswrapper[4753]: I0129 15:49:44.602042 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd" event={"ID":"c0b81ce9-61fe-40de-9647-d14c933f2b13","Type":"ContainerStarted","Data":"3fe223074a76a088852d4eb3a37a6993e73bdf2b6ca33e05b081e41eeceba6a7"} Jan 29 15:49:44 crc kubenswrapper[4753]: I0129 15:49:44.603620 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp" event={"ID":"830e25f0-520a-446b-a145-5c1f0f3ceea1","Type":"ContainerStarted","Data":"6a99ce8e50171ec1aedeabbd295ec1a600e675134df57c8fe76e9d1797b831ea"} Jan 29 15:49:44 crc kubenswrapper[4753]: I0129 15:49:44.633855 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-4cprd" podStartSLOduration=2.866993034 podStartE2EDuration="32.633832152s" podCreationTimestamp="2026-01-29 15:49:12 +0000 UTC" firstStartedPulling="2026-01-29 15:49:14.244368681 +0000 UTC m=+6388.939103053" lastFinishedPulling="2026-01-29 15:49:44.011207789 +0000 UTC m=+6418.705942171" observedRunningTime="2026-01-29 15:49:44.630586524 +0000 UTC m=+6419.325320916" watchObservedRunningTime="2026-01-29 15:49:44.633832152 +0000 UTC m=+6419.328566534" Jan 29 15:49:44 crc kubenswrapper[4753]: I0129 15:49:44.663707 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b499c759f-5b2cp" podStartSLOduration=-9223372004.19109 podStartE2EDuration="32.663687039s" podCreationTimestamp="2026-01-29 15:49:12 +0000 UTC" firstStartedPulling="2026-01-29 15:49:14.326089372 +0000 UTC m=+6389.020823744" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-29 15:49:44.656459774 +0000 UTC m=+6419.351194156" watchObservedRunningTime="2026-01-29 15:49:44.663687039 +0000 UTC m=+6419.358421421" Jan 29 15:49:46 crc kubenswrapper[4753]: I0129 15:49:46.472935 4753 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-5z222/crc-debug-dgkkv"] Jan 29 15:49:46 crc kubenswrapper[4753]: I0129 15:49:46.475584 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:49:46 crc kubenswrapper[4753]: I0129 15:49:46.621888 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prk52\" (UniqueName: \"kubernetes.io/projected/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-kube-api-access-prk52\") pod \"crc-debug-dgkkv\" (UID: \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\") " pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:49:46 crc kubenswrapper[4753]: I0129 15:49:46.622038 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-host\") pod \"crc-debug-dgkkv\" (UID: \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\") " pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:49:46 crc kubenswrapper[4753]: I0129 15:49:46.723264 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prk52\" (UniqueName: \"kubernetes.io/projected/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-kube-api-access-prk52\") pod \"crc-debug-dgkkv\" (UID: \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\") " pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:49:46 crc kubenswrapper[4753]: I0129 15:49:46.723913 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-host\") pod \"crc-debug-dgkkv\" (UID: \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\") " pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:49:46 crc kubenswrapper[4753]: I0129 15:49:46.724087 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-host\") pod \"crc-debug-dgkkv\" (UID: \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\") " pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:49:46 crc kubenswrapper[4753]: I0129 15:49:46.745503 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prk52\" (UniqueName: \"kubernetes.io/projected/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-kube-api-access-prk52\") pod \"crc-debug-dgkkv\" (UID: \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\") " pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:49:46 crc kubenswrapper[4753]: I0129 15:49:46.795859 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:49:46 crc kubenswrapper[4753]: W0129 15:49:46.854922 4753 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d74a0f9_6e19_48ca_b44b_6c1e2792db44.slice/crio-faa11c57764e57ba25adf08f4842cac513694a42427aa844d966eac66f9731dc WatchSource:0}: Error finding container faa11c57764e57ba25adf08f4842cac513694a42427aa844d966eac66f9731dc: Status 404 returned error can't find the container with id faa11c57764e57ba25adf08f4842cac513694a42427aa844d966eac66f9731dc Jan 29 15:49:47 crc kubenswrapper[4753]: I0129 15:49:47.635071 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5z222/crc-debug-dgkkv" event={"ID":"5d74a0f9-6e19-48ca-b44b-6c1e2792db44","Type":"ContainerStarted","Data":"faa11c57764e57ba25adf08f4842cac513694a42427aa844d966eac66f9731dc"} Jan 29 15:49:56 crc kubenswrapper[4753]: I0129 15:49:56.155369 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:49:56 crc kubenswrapper[4753]: E0129 15:49:56.156269 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:50:01 crc kubenswrapper[4753]: I0129 15:50:01.784864 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5z222/crc-debug-dgkkv" event={"ID":"5d74a0f9-6e19-48ca-b44b-6c1e2792db44","Type":"ContainerStarted","Data":"04c3ed89d5ac7394ffc9b1a06d61f21d39c06d671a648412053630f80a79465c"} Jan 29 15:50:01 crc kubenswrapper[4753]: I0129 15:50:01.812069 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5z222/crc-debug-dgkkv" podStartSLOduration=1.763189019 podStartE2EDuration="15.812047911s" podCreationTimestamp="2026-01-29 15:49:46 +0000 UTC" firstStartedPulling="2026-01-29 15:49:46.857140508 +0000 UTC m=+6421.551874890" lastFinishedPulling="2026-01-29 15:50:00.9059994 +0000 UTC m=+6435.600733782" observedRunningTime="2026-01-29 15:50:01.81020841 +0000 UTC m=+6436.504942812" watchObservedRunningTime="2026-01-29 15:50:01.812047911 +0000 UTC m=+6436.506782293" Jan 29 15:50:10 crc kubenswrapper[4753]: I0129 15:50:10.149916 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:50:10 crc kubenswrapper[4753]: E0129 15:50:10.150713 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.274680 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lhlkk"] Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.277331 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.299280 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhlkk"] Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.434418 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8md9\" (UniqueName: \"kubernetes.io/projected/40e88645-493b-49af-a4d1-f49e759b2d09-kube-api-access-j8md9\") pod \"redhat-marketplace-lhlkk\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.435100 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-catalog-content\") pod \"redhat-marketplace-lhlkk\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.435326 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-utilities\") pod \"redhat-marketplace-lhlkk\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.536996 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-catalog-content\") pod \"redhat-marketplace-lhlkk\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.537171 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-utilities\") pod \"redhat-marketplace-lhlkk\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.537246 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8md9\" (UniqueName: \"kubernetes.io/projected/40e88645-493b-49af-a4d1-f49e759b2d09-kube-api-access-j8md9\") pod \"redhat-marketplace-lhlkk\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.538191 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-catalog-content\") pod \"redhat-marketplace-lhlkk\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.538442 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-utilities\") pod \"redhat-marketplace-lhlkk\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.555824 4753 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j8md9\" (UniqueName: \"kubernetes.io/projected/40e88645-493b-49af-a4d1-f49e759b2d09-kube-api-access-j8md9\") pod \"redhat-marketplace-lhlkk\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:15 crc kubenswrapper[4753]: I0129 15:50:15.605298 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:18 crc kubenswrapper[4753]: I0129 15:50:18.659277 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhlkk"] Jan 29 15:50:18 crc kubenswrapper[4753]: I0129 15:50:18.941600 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhlkk" event={"ID":"40e88645-493b-49af-a4d1-f49e759b2d09","Type":"ContainerStarted","Data":"07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c"} Jan 29 15:50:18 crc kubenswrapper[4753]: I0129 15:50:18.941947 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhlkk" event={"ID":"40e88645-493b-49af-a4d1-f49e759b2d09","Type":"ContainerStarted","Data":"2ad3b3c60878814d73845ab96babbc72c233807705a817c7117e239f95aec3af"} Jan 29 15:50:19 crc kubenswrapper[4753]: I0129 15:50:19.952827 4753 generic.go:334] "Generic (PLEG): container finished" podID="40e88645-493b-49af-a4d1-f49e759b2d09" containerID="07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c" exitCode=0 Jan 29 15:50:19 crc kubenswrapper[4753]: I0129 15:50:19.953140 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhlkk" event={"ID":"40e88645-493b-49af-a4d1-f49e759b2d09","Type":"ContainerDied","Data":"07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c"} Jan 29 15:50:21 crc kubenswrapper[4753]: I0129 15:50:21.988879 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhlkk" event={"ID":"40e88645-493b-49af-a4d1-f49e759b2d09","Type":"ContainerStarted","Data":"61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83"} Jan 29 15:50:24 crc kubenswrapper[4753]: I0129 15:50:24.008688 4753 generic.go:334] "Generic (PLEG): container finished" podID="40e88645-493b-49af-a4d1-f49e759b2d09" containerID="61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83" exitCode=0 Jan 29 15:50:24 crc kubenswrapper[4753]: I0129 15:50:24.008770 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhlkk" event={"ID":"40e88645-493b-49af-a4d1-f49e759b2d09","Type":"ContainerDied","Data":"61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83"} Jan 29 15:50:24 crc kubenswrapper[4753]: I0129 15:50:24.149953 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:50:24 crc kubenswrapper[4753]: E0129 15:50:24.150221 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:50:26 crc kubenswrapper[4753]: I0129 15:50:26.029290 4753 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-lhlkk" event={"ID":"40e88645-493b-49af-a4d1-f49e759b2d09","Type":"ContainerStarted","Data":"27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6"} Jan 29 15:50:26 crc kubenswrapper[4753]: I0129 15:50:26.066745 4753 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lhlkk" podStartSLOduration=6.231801627 podStartE2EDuration="11.066719743s" podCreationTimestamp="2026-01-29 15:50:15 +0000 UTC" firstStartedPulling="2026-01-29 15:50:19.954931765 +0000 UTC m=+6454.649666147" lastFinishedPulling="2026-01-29 15:50:24.789849881 +0000 UTC m=+6459.484584263" observedRunningTime="2026-01-29 15:50:26.059131847 +0000 UTC m=+6460.753866229" watchObservedRunningTime="2026-01-29 15:50:26.066719743 +0000 UTC m=+6460.761454125" Jan 29 15:50:30 crc kubenswrapper[4753]: I0129 15:50:30.062919 4753 generic.go:334] "Generic (PLEG): container finished" podID="5d74a0f9-6e19-48ca-b44b-6c1e2792db44" containerID="04c3ed89d5ac7394ffc9b1a06d61f21d39c06d671a648412053630f80a79465c" exitCode=0 Jan 29 15:50:30 crc kubenswrapper[4753]: I0129 15:50:30.063012 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5z222/crc-debug-dgkkv" event={"ID":"5d74a0f9-6e19-48ca-b44b-6c1e2792db44","Type":"ContainerDied","Data":"04c3ed89d5ac7394ffc9b1a06d61f21d39c06d671a648412053630f80a79465c"} Jan 29 15:50:31 crc kubenswrapper[4753]: I0129 15:50:31.189957 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:50:31 crc kubenswrapper[4753]: I0129 15:50:31.223905 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5z222/crc-debug-dgkkv"] Jan 29 15:50:31 crc kubenswrapper[4753]: I0129 15:50:31.238565 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5z222/crc-debug-dgkkv"] Jan 29 15:50:31 crc kubenswrapper[4753]: I0129 15:50:31.279271 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-host\") pod \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\" (UID: \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\") " Jan 29 15:50:31 crc kubenswrapper[4753]: I0129 15:50:31.279397 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prk52\" (UniqueName: \"kubernetes.io/projected/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-kube-api-access-prk52\") pod \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\" (UID: \"5d74a0f9-6e19-48ca-b44b-6c1e2792db44\") " Jan 29 15:50:31 crc kubenswrapper[4753]: I0129 15:50:31.279714 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-host" (OuterVolumeSpecName: "host") pod "5d74a0f9-6e19-48ca-b44b-6c1e2792db44" (UID: "5d74a0f9-6e19-48ca-b44b-6c1e2792db44"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:50:31 crc kubenswrapper[4753]: I0129 15:50:31.280399 4753 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-host\") on node \"crc\" DevicePath \"\"" Jan 29 15:50:31 crc kubenswrapper[4753]: I0129 15:50:31.300973 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-kube-api-access-prk52" (OuterVolumeSpecName: "kube-api-access-prk52") pod "5d74a0f9-6e19-48ca-b44b-6c1e2792db44" (UID: "5d74a0f9-6e19-48ca-b44b-6c1e2792db44"). InnerVolumeSpecName "kube-api-access-prk52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:50:31 crc kubenswrapper[4753]: I0129 15:50:31.381700 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prk52\" (UniqueName: \"kubernetes.io/projected/5d74a0f9-6e19-48ca-b44b-6c1e2792db44-kube-api-access-prk52\") on node \"crc\" DevicePath \"\"" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.081671 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faa11c57764e57ba25adf08f4842cac513694a42427aa844d966eac66f9731dc" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.082098 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5z222/crc-debug-dgkkv" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.161991 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d74a0f9-6e19-48ca-b44b-6c1e2792db44" path="/var/lib/kubelet/pods/5d74a0f9-6e19-48ca-b44b-6c1e2792db44/volumes" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.522369 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5z222/crc-debug-cx4hs"] Jan 29 15:50:32 crc kubenswrapper[4753]: E0129 15:50:32.522880 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d74a0f9-6e19-48ca-b44b-6c1e2792db44" containerName="container-00" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.522895 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d74a0f9-6e19-48ca-b44b-6c1e2792db44" containerName="container-00" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.523119 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d74a0f9-6e19-48ca-b44b-6c1e2792db44" containerName="container-00" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.524004 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.707605 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4rmt\" (UniqueName: \"kubernetes.io/projected/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-kube-api-access-p4rmt\") pod \"crc-debug-cx4hs\" (UID: \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\") " pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.707861 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-host\") pod \"crc-debug-cx4hs\" (UID: \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\") " pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.810110 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4rmt\" (UniqueName: \"kubernetes.io/projected/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-kube-api-access-p4rmt\") pod \"crc-debug-cx4hs\" (UID: \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\") " pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.810676 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-host\") pod \"crc-debug-cx4hs\" (UID: \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\") " pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.810852 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-host\") pod \"crc-debug-cx4hs\" (UID: \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\") " pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:32 crc kubenswrapper[4753]: I0129 15:50:32.846975 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4rmt\" (UniqueName: \"kubernetes.io/projected/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-kube-api-access-p4rmt\") pod \"crc-debug-cx4hs\" (UID: \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\") " pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:33 crc kubenswrapper[4753]: I0129 15:50:33.141962 4753 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:34 crc kubenswrapper[4753]: I0129 15:50:34.099743 4753 generic.go:334] "Generic (PLEG): container finished" podID="70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1" containerID="97d3c1b5eac417f7bd468f1fc2543982fcaf77feab98122f74f4b88baf112665" exitCode=1 Jan 29 15:50:34 crc kubenswrapper[4753]: I0129 15:50:34.099811 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5z222/crc-debug-cx4hs" event={"ID":"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1","Type":"ContainerDied","Data":"97d3c1b5eac417f7bd468f1fc2543982fcaf77feab98122f74f4b88baf112665"} Jan 29 15:50:34 crc kubenswrapper[4753]: I0129 15:50:34.101243 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5z222/crc-debug-cx4hs" event={"ID":"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1","Type":"ContainerStarted","Data":"d9ceb369a51c3dc4136322c62476a549ecfdec4219187864288eda5bde699948"} Jan 29 15:50:34 crc kubenswrapper[4753]: I0129 15:50:34.146033 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5z222/crc-debug-cx4hs"] Jan 29 15:50:34 crc kubenswrapper[4753]: I0129 15:50:34.162820 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5z222/crc-debug-cx4hs"] Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.213059 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.358862 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4rmt\" (UniqueName: \"kubernetes.io/projected/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-kube-api-access-p4rmt\") pod \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\" (UID: \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\") " Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.358990 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-host\") pod \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\" (UID: \"70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1\") " Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.359311 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-host" (OuterVolumeSpecName: "host") pod "70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1" (UID: "70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.359747 4753 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-host\") on node \"crc\" DevicePath \"\"" Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.365260 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-kube-api-access-p4rmt" (OuterVolumeSpecName: "kube-api-access-p4rmt") pod "70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1" (UID: "70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1"). InnerVolumeSpecName "kube-api-access-p4rmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.461129 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4rmt\" (UniqueName: \"kubernetes.io/projected/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1-kube-api-access-p4rmt\") on node \"crc\" DevicePath \"\"" Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.605761 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.605804 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:35 crc kubenswrapper[4753]: I0129 15:50:35.664565 4753 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:36 crc kubenswrapper[4753]: I0129 15:50:36.117889 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5z222/crc-debug-cx4hs" Jan 29 15:50:36 crc kubenswrapper[4753]: I0129 15:50:36.118322 4753 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9ceb369a51c3dc4136322c62476a549ecfdec4219187864288eda5bde699948" Jan 29 15:50:36 crc kubenswrapper[4753]: I0129 15:50:36.159591 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:50:36 crc kubenswrapper[4753]: E0129 15:50:36.160123 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:50:36 crc kubenswrapper[4753]: I0129 15:50:36.174557 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1" path="/var/lib/kubelet/pods/70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1/volumes" Jan 29 15:50:36 crc kubenswrapper[4753]: I0129 15:50:36.189048 4753 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:36 crc kubenswrapper[4753]: I0129 15:50:36.250614 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhlkk"] Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.134348 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lhlkk" podUID="40e88645-493b-49af-a4d1-f49e759b2d09" containerName="registry-server" containerID="cri-o://27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6" gracePeriod=2 Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.740123 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.829798 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-utilities\") pod \"40e88645-493b-49af-a4d1-f49e759b2d09\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.829964 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-catalog-content\") pod \"40e88645-493b-49af-a4d1-f49e759b2d09\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.830009 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8md9\" (UniqueName: \"kubernetes.io/projected/40e88645-493b-49af-a4d1-f49e759b2d09-kube-api-access-j8md9\") pod \"40e88645-493b-49af-a4d1-f49e759b2d09\" (UID: \"40e88645-493b-49af-a4d1-f49e759b2d09\") " Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.830990 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-utilities" (OuterVolumeSpecName: "utilities") pod "40e88645-493b-49af-a4d1-f49e759b2d09" (UID: "40e88645-493b-49af-a4d1-f49e759b2d09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.836032 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e88645-493b-49af-a4d1-f49e759b2d09-kube-api-access-j8md9" (OuterVolumeSpecName: "kube-api-access-j8md9") pod "40e88645-493b-49af-a4d1-f49e759b2d09" (UID: "40e88645-493b-49af-a4d1-f49e759b2d09"). InnerVolumeSpecName "kube-api-access-j8md9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.854348 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40e88645-493b-49af-a4d1-f49e759b2d09" (UID: "40e88645-493b-49af-a4d1-f49e759b2d09"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.931848 4753 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.931906 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8md9\" (UniqueName: \"kubernetes.io/projected/40e88645-493b-49af-a4d1-f49e759b2d09-kube-api-access-j8md9\") on node \"crc\" DevicePath \"\"" Jan 29 15:50:38 crc kubenswrapper[4753]: I0129 15:50:38.931919 4753 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e88645-493b-49af-a4d1-f49e759b2d09-utilities\") on node \"crc\" DevicePath \"\"" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.154064 4753 generic.go:334] "Generic (PLEG): container finished" podID="40e88645-493b-49af-a4d1-f49e759b2d09" containerID="27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6" exitCode=0 Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.154112 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhlkk" event={"ID":"40e88645-493b-49af-a4d1-f49e759b2d09","Type":"ContainerDied","Data":"27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6"} Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.154140 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lhlkk" event={"ID":"40e88645-493b-49af-a4d1-f49e759b2d09","Type":"ContainerDied","Data":"2ad3b3c60878814d73845ab96babbc72c233807705a817c7117e239f95aec3af"} Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.154186 4753 scope.go:117] "RemoveContainer" containerID="27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.154327 4753 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lhlkk" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.183428 4753 scope.go:117] "RemoveContainer" containerID="61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.217642 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhlkk"] Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.235050 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lhlkk"] Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.249728 4753 scope.go:117] "RemoveContainer" containerID="07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.286640 4753 scope.go:117] "RemoveContainer" containerID="27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6" Jan 29 15:50:39 crc kubenswrapper[4753]: E0129 15:50:39.287137 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6\": container with ID starting with 27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6 not found: ID does not exist" containerID="27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.287198 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6"} err="failed to get container status \"27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6\": rpc error: code = NotFound desc = could not find container \"27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6\": container with ID starting with 27399b66f06ad9191330557e2995650b6498282630c6a5b27a0fd798d9455bb6 not found: ID does not exist" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.287224 4753 scope.go:117] "RemoveContainer" containerID="61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83" Jan 29 15:50:39 crc kubenswrapper[4753]: E0129 15:50:39.287711 4753 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83\": container with ID starting with 61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83 not found: ID does not exist" containerID="61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.287916 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83"} err="failed to get container status \"61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83\": rpc error: code = NotFound desc = could not find container \"61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83\": container with ID starting with 61e02e38284f203c2ebcb33930964977c405a1e6d6517588f90835728637fc83 not found: ID does not exist" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.288229 4753 scope.go:117] "RemoveContainer" containerID="07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c" Jan 29 15:50:39 crc kubenswrapper[4753]: E0129 15:50:39.293194 4753 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c\": container with ID starting with 07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c not found: ID does not exist" containerID="07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c" Jan 29 15:50:39 crc kubenswrapper[4753]: I0129 15:50:39.293246 4753 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c"} err="failed to get container status \"07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c\": rpc error: code = NotFound desc = could not find container \"07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c\": container with ID starting with 07b25d9673a2cda98c2c398c7362ad563ca2db5cab111c55b598b3e1c01f170c not found: ID does not exist" Jan 29 15:50:40 crc kubenswrapper[4753]: I0129 15:50:40.164982 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e88645-493b-49af-a4d1-f49e759b2d09" path="/var/lib/kubelet/pods/40e88645-493b-49af-a4d1-f49e759b2d09/volumes" Jan 29 15:50:49 crc kubenswrapper[4753]: I0129 15:50:49.150262 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:50:49 crc kubenswrapper[4753]: E0129 15:50:49.151080 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:51:00 crc kubenswrapper[4753]: I0129 15:51:00.150435 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:51:00 crc kubenswrapper[4753]: E0129 15:51:00.151483 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:51:03 crc kubenswrapper[4753]: I0129 15:51:03.344141 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-699cfdb8d4-skqb8_2a04bc36-b333-40bb-8a95-38a148b53e8b/barbican-api/0.log" Jan 29 15:51:03 crc kubenswrapper[4753]: I0129 15:51:03.519741 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-699cfdb8d4-skqb8_2a04bc36-b333-40bb-8a95-38a148b53e8b/barbican-api-log/0.log" Jan 29 15:51:03 crc kubenswrapper[4753]: I0129 15:51:03.605285 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78c4f66974-59mhf_710ca968-bd29-41e7-9101-11e445b4fc1b/barbican-keystone-listener/0.log" Jan 29 15:51:03 crc kubenswrapper[4753]: I0129 15:51:03.709050 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78c4f66974-59mhf_710ca968-bd29-41e7-9101-11e445b4fc1b/barbican-keystone-listener-log/0.log" Jan 29 15:51:03 crc kubenswrapper[4753]: I0129 
15:51:03.791419 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7cfd7cb57f-82mpw_1b997396-38f1-426e-a2d8-318808c53a6c/barbican-worker/0.log" Jan 29 15:51:03 crc kubenswrapper[4753]: I0129 15:51:03.887753 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7cfd7cb57f-82mpw_1b997396-38f1-426e-a2d8-318808c53a6c/barbican-worker-log/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.025704 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d567cae9-e775-4bd7-af21-b75b563dd220/cinder-api/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.064905 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d567cae9-e775-4bd7-af21-b75b563dd220/cinder-api-log/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.225588 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_567bff90-0d20-4a19-b301-eac95246fc6d/probe/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.318960 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_567bff90-0d20-4a19-b301-eac95246fc6d/cinder-backup/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.408115 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_78998281-5df0-4295-80f6-3b9e0cd7fac4/cinder-scheduler/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.462703 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_78998281-5df0-4295-80f6-3b9e0cd7fac4/probe/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.615315 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_6d501b41-fcc7-47e2-8dae-b76c5b7b0519/cinder-volume/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.636786 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_6d501b41-fcc7-47e2-8dae-b76c5b7b0519/probe/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.823091 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6c4df955-76jcm_87c3e570-7085-4a5f-b38b-7d4d0df86a99/init/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.981539 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6c4df955-76jcm_87c3e570-7085-4a5f-b38b-7d4d0df86a99/init/0.log" Jan 29 15:51:04 crc kubenswrapper[4753]: I0129 15:51:04.981679 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6c4df955-76jcm_87c3e570-7085-4a5f-b38b-7d4d0df86a99/dnsmasq-dns/0.log" Jan 29 15:51:05 crc kubenswrapper[4753]: I0129 15:51:05.024530 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_104ef6b1-854f-423e-bfde-2a4d8beedd8f/glance-httpd/0.log" Jan 29 15:51:05 crc kubenswrapper[4753]: I0129 15:51:05.150497 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_104ef6b1-854f-423e-bfde-2a4d8beedd8f/glance-log/0.log" Jan 29 15:51:05 crc kubenswrapper[4753]: I0129 15:51:05.197572 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ea96ed85-0d6b-4874-ae27-4844dc5ac67b/glance-httpd/0.log" Jan 29 15:51:05 crc kubenswrapper[4753]: I0129 15:51:05.298602 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_ea96ed85-0d6b-4874-ae27-4844dc5ac67b/glance-log/0.log" Jan 29 15:51:05 crc kubenswrapper[4753]: I0129 15:51:05.474913 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5dc4fc86d8-4q5ch_009bf51b-359a-4af2-9834-ef20a1268c6d/heat-api/0.log" Jan 29 15:51:05 crc kubenswrapper[4753]: I0129 15:51:05.569810 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cb3d-account-create-update-zsflm_8a2d9450-6c48-4896-b5f6-cafb5803c488/mariadb-account-create-update/0.log" Jan 29 15:51:05 crc kubenswrapper[4753]: I0129 15:51:05.707694 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-db45c8f86-r6kzd_38e0b4cf-43ec-4245-9b77-5198f3846248/heat-cfnapi/0.log" Jan 29 15:51:05 crc kubenswrapper[4753]: I0129 15:51:05.789535 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-create-2tsdk_5980b21e-1b3d-4944-b456-b7b1047ab256/mariadb-database-create/0.log" Jan 29 15:51:05 crc kubenswrapper[4753]: I0129 15:51:05.921390 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-sync-w4q8l_17a71ced-859b-411a-8502-2bbeeae1bf5e/heat-db-sync/0.log" Jan 29 15:51:06 crc kubenswrapper[4753]: I0129 15:51:06.068631 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6b948c8fcf-v2dml_b7c80c58-8fa7-4f69-8909-2134cb48d953/heat-engine/0.log" Jan 29 15:51:06 crc kubenswrapper[4753]: I0129 15:51:06.246284 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6797b66f4f-wv5m7_5fced629-f257-4241-9f17-7856b0472fb9/horizon-log/0.log" Jan 29 15:51:06 crc kubenswrapper[4753]: I0129 15:51:06.303690 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6797b66f4f-wv5m7_5fced629-f257-4241-9f17-7856b0472fb9/horizon/0.log" Jan 29 15:51:06 crc kubenswrapper[4753]: I0129 15:51:06.487233 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6d955797-s6jpd_f5d2e134-c722-47ef-b1c9-696e16fa72ce/keystone-api/0.log" Jan 29 15:51:06 crc kubenswrapper[4753]: I0129 15:51:06.552543 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_744ae72c-556c-4e95-afc0-0f7fb7cb0923/adoption/0.log" Jan 29 15:51:06 crc kubenswrapper[4753]: I0129 15:51:06.839597 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c88956d6f-wsn9g_56cda1a5-a73b-4da0-b9e1-0d95f12387c8/neutron-api/0.log" Jan 29 15:51:06 crc kubenswrapper[4753]: I0129 15:51:06.931612 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c88956d6f-wsn9g_56cda1a5-a73b-4da0-b9e1-0d95f12387c8/neutron-httpd/0.log" Jan 29 15:51:07 crc kubenswrapper[4753]: I0129 15:51:07.194645 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_493fa42b-a667-4c4f-8473-2848bb63714b/nova-api-api/0.log" Jan 29 15:51:07 crc kubenswrapper[4753]: I0129 15:51:07.302210 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_493fa42b-a667-4c4f-8473-2848bb63714b/nova-api-log/0.log" Jan 29 15:51:07 crc kubenswrapper[4753]: I0129 15:51:07.528581 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_33161748-700b-402e-b133-35b428c5887d/nova-cell0-conductor-conductor/0.log" Jan 29 15:51:07 crc kubenswrapper[4753]: I0129 15:51:07.657335 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_016bbe0f-680c-4d36-a48d-8092b9174669/nova-cell1-conductor-conductor/0.log" Jan 29 15:51:07 crc kubenswrapper[4753]: I0129 15:51:07.810365 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_968ff4a1-242c-4e6c-af17-20956a2c99c4/nova-cell1-novncproxy-novncproxy/0.log" Jan 29 15:51:07 crc kubenswrapper[4753]: I0129 15:51:07.914802 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_80fcb37d-8d12-4311-859b-6aeb6e200031/nova-metadata-metadata/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.037254 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_80fcb37d-8d12-4311-859b-6aeb6e200031/nova-metadata-log/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.124604 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_dbd601de-74ab-4a96-9e9d-47fc53a047ac/nova-scheduler-scheduler/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.307441 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6db4d9d88d-tngdc_8f2e6df8-8fdf-43d7-bf49-b97301d2767a/init/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.480134 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6db4d9d88d-tngdc_8f2e6df8-8fdf-43d7-bf49-b97301d2767a/init/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.598516 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6db4d9d88d-tngdc_8f2e6df8-8fdf-43d7-bf49-b97301d2767a/octavia-api-provider-agent/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.693482 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6db4d9d88d-tngdc_8f2e6df8-8fdf-43d7-bf49-b97301d2767a/octavia-api/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.719748 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-t2tws_65774cd6-bd93-4aef-8747-d2436ecb13ec/init/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.863315 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-t2tws_65774cd6-bd93-4aef-8747-d2436ecb13ec/init/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.923841 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-vv6cz_f72baddf-f7f0-4f14-9d4e-cf49b4790797/init/0.log" Jan 29 15:51:08 crc kubenswrapper[4753]: I0129 15:51:08.958520 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-t2tws_65774cd6-bd93-4aef-8747-d2436ecb13ec/octavia-healthmanager/0.log" Jan 29 15:51:09 crc kubenswrapper[4753]: I0129 15:51:09.158540 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-vv6cz_f72baddf-f7f0-4f14-9d4e-cf49b4790797/init/0.log" Jan 29 15:51:09 crc kubenswrapper[4753]: I0129 15:51:09.259957 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-vv6cz_f72baddf-f7f0-4f14-9d4e-cf49b4790797/octavia-housekeeping/0.log" Jan 29 15:51:09 crc kubenswrapper[4753]: I0129 15:51:09.338718 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-65dd99cb46-96gzr_ce9f7a8b-5457-4348-8850-959a150b9a35/init/0.log" Jan 29 15:51:09 crc kubenswrapper[4753]: I0129 15:51:09.474178 4753 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_octavia-image-upload-65dd99cb46-96gzr_ce9f7a8b-5457-4348-8850-959a150b9a35/octavia-amphora-httpd/0.log" Jan 29 15:51:09 crc kubenswrapper[4753]: I0129 15:51:09.534480 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-65dd99cb46-96gzr_ce9f7a8b-5457-4348-8850-959a150b9a35/init/0.log" Jan 29 15:51:09 crc kubenswrapper[4753]: I0129 15:51:09.604666 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-fs4wp_4f4c395f-96c2-4745-a815-dc86a0a60498/init/0.log" Jan 29 15:51:09 crc kubenswrapper[4753]: I0129 15:51:09.827855 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-vdlhc_6bb0bc99-03cf-48b9-a648-7141b56c18ba/init/0.log" Jan 29 15:51:09 crc kubenswrapper[4753]: I0129 15:51:09.847353 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-fs4wp_4f4c395f-96c2-4745-a815-dc86a0a60498/init/0.log" Jan 29 15:51:09 crc kubenswrapper[4753]: I0129 15:51:09.922345 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-fs4wp_4f4c395f-96c2-4745-a815-dc86a0a60498/octavia-rsyslog/0.log" Jan 29 15:51:10 crc kubenswrapper[4753]: I0129 15:51:10.290652 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-vdlhc_6bb0bc99-03cf-48b9-a648-7141b56c18ba/init/0.log" Jan 29 15:51:10 crc kubenswrapper[4753]: I0129 15:51:10.345970 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0686c722-99c5-44dd-994b-3525d5642d96/mysql-bootstrap/0.log" Jan 29 15:51:10 crc kubenswrapper[4753]: I0129 15:51:10.376253 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-vdlhc_6bb0bc99-03cf-48b9-a648-7141b56c18ba/octavia-worker/0.log" Jan 29 15:51:10 crc kubenswrapper[4753]: I0129 15:51:10.777378 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0686c722-99c5-44dd-994b-3525d5642d96/mysql-bootstrap/0.log" Jan 29 15:51:10 crc kubenswrapper[4753]: I0129 15:51:10.784455 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_edfdd3cb-77ed-4232-898c-8b61bad9c133/mysql-bootstrap/0.log" Jan 29 15:51:10 crc kubenswrapper[4753]: I0129 15:51:10.789789 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0686c722-99c5-44dd-994b-3525d5642d96/galera/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.008254 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_edfdd3cb-77ed-4232-898c-8b61bad9c133/mysql-bootstrap/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.024299 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_edfdd3cb-77ed-4232-898c-8b61bad9c133/galera/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.051269 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_218a05db-5006-47e3-992f-2d49802ffe9f/openstackclient/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.246909 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jr6fv_e68ccb6b-3714-472e-a754-247bb456104d/ovn-controller/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.488390 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-jr9zv_d7baa621-2d41-4a8a-ad05-87491d3f20ad/openstack-network-exporter/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.609495 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fljn5_68a6ad2e-376c-4b67-94f1-2c39cd523d5b/ovsdb-server-init/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.799080 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fljn5_68a6ad2e-376c-4b67-94f1-2c39cd523d5b/ovs-vswitchd/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.803946 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fljn5_68a6ad2e-376c-4b67-94f1-2c39cd523d5b/ovsdb-server-init/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.876913 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fljn5_68a6ad2e-376c-4b67-94f1-2c39cd523d5b/ovsdb-server/0.log" Jan 29 15:51:11 crc kubenswrapper[4753]: I0129 15:51:11.914409 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_564d2d65-9b14-436e-9b59-93679ab0466f/memcached/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.011040 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_b4637574-7d3a-4145-91e7-b46c949531e2/adoption/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.084785 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3ebf42b3-0d4c-45b4-b765-f02514c8dc3f/ovn-northd/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.105346 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3ebf42b3-0d4c-45b4-b765-f02514c8dc3f/openstack-network-exporter/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.552277 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4da64566-6ed8-41ab-aaa7-354bead2c806/ovsdbserver-nb/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.563495 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_231b5162-2261-4cff-80bb-10a61ef63095/openstack-network-exporter/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.595651 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4da64566-6ed8-41ab-aaa7-354bead2c806/openstack-network-exporter/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.608562 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_231b5162-2261-4cff-80bb-10a61ef63095/ovsdbserver-nb/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.750571 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_df70109c-a132-46c6-92d9-ea3af4d90b11/ovsdbserver-nb/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.794961 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_df70109c-a132-46c6-92d9-ea3af4d90b11/openstack-network-exporter/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.829938 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ff76ca68-ec4f-4730-9c99-8e51389ba0a6/openstack-network-exporter/0.log" Jan 29 15:51:12 crc kubenswrapper[4753]: I0129 15:51:12.974209 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_ff76ca68-ec4f-4730-9c99-8e51389ba0a6/ovsdbserver-sb/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.044980 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_f34befd5-5c61-47e0-8dd6-e3637efdbc8d/openstack-network-exporter/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.053665 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_f34befd5-5c61-47e0-8dd6-e3637efdbc8d/ovsdbserver-sb/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.194214 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_e4a88502-4db4-4387-a942-bba43250fb20/openstack-network-exporter/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.282292 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_e4a88502-4db4-4387-a942-bba43250fb20/ovsdbserver-sb/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.365559 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7649df9fd-nqw7s_0f9e94f3-540e-4cc6-a623-59b52104d6c8/placement-api/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.390376 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7649df9fd-nqw7s_0f9e94f3-540e-4cc6-a623-59b52104d6c8/placement-log/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.467231 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_95e968fb-1cc0-4aae-a72a-204c2515f449/setup-container/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.657078 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_95e968fb-1cc0-4aae-a72a-204c2515f449/setup-container/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.666494 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_95e968fb-1cc0-4aae-a72a-204c2515f449/rabbitmq/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.690236 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_135e4efe-2677-4433-9a33-e2d1e1220037/setup-container/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.871714 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_135e4efe-2677-4433-9a33-e2d1e1220037/setup-container/0.log" Jan 29 15:51:13 crc kubenswrapper[4753]: I0129 15:51:13.973337 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_135e4efe-2677-4433-9a33-e2d1e1220037/rabbitmq/0.log" Jan 29 15:51:15 crc kubenswrapper[4753]: I0129 15:51:15.150334 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:51:15 crc kubenswrapper[4753]: E0129 15:51:15.151175 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:51:26 crc kubenswrapper[4753]: I0129 15:51:26.157223 4753 scope.go:117] "RemoveContainer" 
containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:51:26 crc kubenswrapper[4753]: E0129 15:51:26.157998 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:51:32 crc kubenswrapper[4753]: I0129 15:51:32.960080 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg_9ce44e70-fc27-417b-a381-1e253c42a007/util/0.log" Jan 29 15:51:33 crc kubenswrapper[4753]: I0129 15:51:33.170785 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg_9ce44e70-fc27-417b-a381-1e253c42a007/util/0.log" Jan 29 15:51:33 crc kubenswrapper[4753]: I0129 15:51:33.188927 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg_9ce44e70-fc27-417b-a381-1e253c42a007/pull/0.log" Jan 29 15:51:33 crc kubenswrapper[4753]: I0129 15:51:33.241749 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg_9ce44e70-fc27-417b-a381-1e253c42a007/pull/0.log" Jan 29 15:51:33 crc kubenswrapper[4753]: I0129 15:51:33.433785 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg_9ce44e70-fc27-417b-a381-1e253c42a007/util/0.log" Jan 29 15:51:33 crc kubenswrapper[4753]: I0129 15:51:33.434885 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg_9ce44e70-fc27-417b-a381-1e253c42a007/pull/0.log" Jan 29 15:51:33 crc kubenswrapper[4753]: I0129 15:51:33.500714 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1d5lpg_9ce44e70-fc27-417b-a381-1e253c42a007/extract/0.log" Jan 29 15:51:33 crc kubenswrapper[4753]: I0129 15:51:33.791738 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-28p9j_8ef2b554-7857-404c-adce-f82ebcf71f72/manager/0.log" Jan 29 15:51:33 crc kubenswrapper[4753]: I0129 15:51:33.801289 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-h5nxt_9d8da066-fe2d-4cf7-b721-3155f8f11510/manager/0.log" Jan 29 15:51:33 crc kubenswrapper[4753]: I0129 15:51:33.890404 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-vc8jd_3482315b-e8cc-4dcf-9eb6-fb120739e361/manager/0.log" Jan 29 15:51:34 crc kubenswrapper[4753]: I0129 15:51:34.177804 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-57kmq_ea463c74-766e-424b-a930-cc8cad45ea88/manager/0.log" Jan 29 15:51:34 crc kubenswrapper[4753]: I0129 15:51:34.207233 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-n6kx7_f20cfb79-500e-4652-b32d-098d8b27e031/manager/0.log" Jan 29 15:51:34 crc kubenswrapper[4753]: I0129 15:51:34.265257 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-2mvrq_81d6f0fa-ae4f-46e3-8103-bc97b1afc209/manager/0.log" Jan 29 15:51:34 crc kubenswrapper[4753]: I0129 15:51:34.443011 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-jnsmr_91f93520-085a-4d27-bffa-f8bc1956c686/manager/0.log" Jan 29 15:51:34 crc kubenswrapper[4753]: I0129 15:51:34.793838 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-tjf4x_88bc469c-847c-4e52-9612-cfd238cbcf3d/manager/0.log" Jan 29 15:51:34 crc kubenswrapper[4753]: I0129 15:51:34.917281 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-n6t9d_ed5e20f7-cf91-4238-9472-eba0bcc3183b/manager/0.log" Jan 29 15:51:34 crc kubenswrapper[4753]: I0129 15:51:34.987314 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-mgwrk_6bfa698b-c528-4171-88ee-3480e2715dc9/manager/0.log" Jan 29 15:51:35 crc kubenswrapper[4753]: I0129 15:51:35.069988 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-9777h_7d8a4a16-258a-4759-980b-98f13fa2e64c/manager/0.log" Jan 29 15:51:35 crc kubenswrapper[4753]: I0129 15:51:35.245792 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-hr6vm_6d74ea48-8122-4dae-9adf-472a4d2ce3c9/manager/0.log" Jan 29 15:51:35 crc kubenswrapper[4753]: I0129 15:51:35.503014 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-lml7t_4f2207b2-9101-4661-8ccf-d2eb0c57092a/manager/0.log" Jan 29 15:51:35 crc kubenswrapper[4753]: I0129 15:51:35.558240 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-8wdkj_9591e3df-ea5a-4f4d-b94d-e8bc5d6531fb/manager/0.log" Jan 29 15:51:35 crc kubenswrapper[4753]: I0129 15:51:35.596125 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86dfb79cc7cqbmq_20e19d51-387b-4da6-8e39-652b24176ef9/manager/0.log" Jan 29 15:51:35 crc kubenswrapper[4753]: I0129 15:51:35.777951 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-757f46c65d-zw2hc_e80d82c2-34f9-4d52-84c8-880f1c787a27/operator/0.log" Jan 29 15:51:36 crc kubenswrapper[4753]: I0129 15:51:36.119246 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-scww8_ceb72c42-5011-4360-9b4f-eae057a53ac0/registry-server/0.log" Jan 29 15:51:36 crc kubenswrapper[4753]: I0129 15:51:36.357906 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-c6w2c_10c551ac-f50b-4773-8c83-e3e10e76f0c1/manager/0.log" Jan 29 15:51:36 crc kubenswrapper[4753]: I0129 15:51:36.417469 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-pkk94_a3142007-86c8-4dda-a225-813a250be829/manager/0.log" Jan 29 15:51:36 crc kubenswrapper[4753]: I0129 15:51:36.662072 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pkpp6_26b13b81-bb4c-4b22-88a3-975875eb76dc/operator/0.log" Jan 29 15:51:36 crc kubenswrapper[4753]: I0129 15:51:36.952504 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-56nqb_fc8c4def-3a2e-4f72-a398-fb1c7d6e47e4/manager/0.log" Jan 29 15:51:37 crc kubenswrapper[4753]: I0129 15:51:37.006327 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-5nnl6_460c2c4f-24cd-4817-8145-20641c54a23e/manager/0.log" Jan 29 15:51:37 crc kubenswrapper[4753]: I0129 15:51:37.212113 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-57nnj_ed5b3b30-fd89-4137-9d0c-d9aaa7bd19f0/manager/0.log" Jan 29 15:51:37 crc kubenswrapper[4753]: I0129 15:51:37.512975 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-wfszx_53a3af2e-de03-483f-ba36-253eb5e9db1d/manager/0.log" Jan 29 15:51:37 crc kubenswrapper[4753]: I0129 15:51:37.615394 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b6f655c79-dtkh6_d7e2152d-0998-475e-b645-23df5698e858/manager/0.log" Jan 29 15:51:40 crc kubenswrapper[4753]: I0129 15:51:40.150215 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:51:40 crc kubenswrapper[4753]: E0129 15:51:40.151133 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:51:53 crc kubenswrapper[4753]: I0129 15:51:53.150634 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:51:53 crc kubenswrapper[4753]: E0129 15:51:53.152045 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:51:56 crc kubenswrapper[4753]: I0129 15:51:56.995543 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-d76gc_5a293056-ca09-4e84-86a5-11785aaa9a62/control-plane-machine-set-operator/0.log" Jan 29 15:51:57 crc kubenswrapper[4753]: I0129 15:51:57.221798 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jvpqh_78dab1db-992a-4ae3-97a1-d70613ac41fe/kube-rbac-proxy/0.log" Jan 29 15:51:57 crc 
kubenswrapper[4753]: I0129 15:51:57.249786 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jvpqh_78dab1db-992a-4ae3-97a1-d70613ac41fe/machine-api-operator/0.log" Jan 29 15:52:04 crc kubenswrapper[4753]: I0129 15:52:04.150194 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:52:04 crc kubenswrapper[4753]: E0129 15:52:04.151047 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:52:09 crc kubenswrapper[4753]: I0129 15:52:09.574197 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-t4wnm_9a2d92f9-fefb-4f88-9ee2-3841da9e0a74/cert-manager-controller/0.log" Jan 29 15:52:09 crc kubenswrapper[4753]: I0129 15:52:09.765765 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-55p62_08add473-034f-403a-9f96-7d5fc2e4c8df/cert-manager-cainjector/0.log" Jan 29 15:52:09 crc kubenswrapper[4753]: I0129 15:52:09.835126 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-bhj74_037d7c67-99e9-410d-8f17-77d3f95b1443/cert-manager-webhook/0.log" Jan 29 15:52:19 crc kubenswrapper[4753]: I0129 15:52:19.149796 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:52:19 crc kubenswrapper[4753]: E0129 15:52:19.150826 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:52:21 crc kubenswrapper[4753]: I0129 15:52:21.542347 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-pxnz4_69a0f59f-b01e-4f27-9001-bd2460df3a27/nmstate-console-plugin/0.log" Jan 29 15:52:21 crc kubenswrapper[4753]: I0129 15:52:21.755966 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wfhtc_7738b656-597e-4e3c-89ea-3b16e36b5c9f/nmstate-handler/0.log" Jan 29 15:52:21 crc kubenswrapper[4753]: I0129 15:52:21.811265 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-nvcx7_3c3c4dba-c774-449f-a0a9-afd2119b5730/kube-rbac-proxy/0.log" Jan 29 15:52:21 crc kubenswrapper[4753]: I0129 15:52:21.898955 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-nvcx7_3c3c4dba-c774-449f-a0a9-afd2119b5730/nmstate-metrics/0.log" Jan 29 15:52:21 crc kubenswrapper[4753]: I0129 15:52:21.977385 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-6vgcx_b343a24e-ab8c-4aff-8eeb-99b1f50868eb/nmstate-operator/0.log" Jan 29 15:52:22 crc kubenswrapper[4753]: I0129 
15:52:22.130655 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-ksd2z_34db578d-7849-457b-bf77-5bd07a7fb0b5/nmstate-webhook/0.log" Jan 29 15:52:23 crc kubenswrapper[4753]: I0129 15:52:23.044497 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-2tsdk"] Jan 29 15:52:23 crc kubenswrapper[4753]: I0129 15:52:23.055125 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cb3d-account-create-update-zsflm"] Jan 29 15:52:23 crc kubenswrapper[4753]: I0129 15:52:23.064651 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-2tsdk"] Jan 29 15:52:23 crc kubenswrapper[4753]: I0129 15:52:23.074046 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cb3d-account-create-update-zsflm"] Jan 29 15:52:24 crc kubenswrapper[4753]: I0129 15:52:24.160565 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5980b21e-1b3d-4944-b456-b7b1047ab256" path="/var/lib/kubelet/pods/5980b21e-1b3d-4944-b456-b7b1047ab256/volumes" Jan 29 15:52:24 crc kubenswrapper[4753]: I0129 15:52:24.161680 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2d9450-6c48-4896-b5f6-cafb5803c488" path="/var/lib/kubelet/pods/8a2d9450-6c48-4896-b5f6-cafb5803c488/volumes" Jan 29 15:52:34 crc kubenswrapper[4753]: I0129 15:52:34.150045 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:52:34 crc kubenswrapper[4753]: E0129 15:52:34.150947 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:52:35 crc kubenswrapper[4753]: I0129 15:52:35.718506 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-k6ssj_8d9e5081-ef9a-4c43-a32a-e4917c8c1db2/prometheus-operator/0.log" Jan 29 15:52:35 crc kubenswrapper[4753]: I0129 15:52:35.935752 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b499c759f-4cprd_c0b81ce9-61fe-40de-9647-d14c933f2b13/prometheus-operator-admission-webhook/0.log" Jan 29 15:52:36 crc kubenswrapper[4753]: I0129 15:52:36.007643 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b499c759f-5b2cp_830e25f0-520a-446b-a145-5c1f0f3ceea1/prometheus-operator-admission-webhook/0.log" Jan 29 15:52:36 crc kubenswrapper[4753]: I0129 15:52:36.156113 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-j8wsv_45d80c47-de8f-426c-a6cf-c46adcf7394a/operator/0.log" Jan 29 15:52:36 crc kubenswrapper[4753]: I0129 15:52:36.258651 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dwlls_785cd806-53b9-41c5-bde3-8445651ffff5/perses-operator/0.log" Jan 29 15:52:42 crc kubenswrapper[4753]: I0129 15:52:42.046207 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-w4q8l"] Jan 29 15:52:42 crc kubenswrapper[4753]: I0129 15:52:42.062968 4753 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-w4q8l"] Jan 29 15:52:42 crc kubenswrapper[4753]: I0129 15:52:42.161393 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a71ced-859b-411a-8502-2bbeeae1bf5e" path="/var/lib/kubelet/pods/17a71ced-859b-411a-8502-2bbeeae1bf5e/volumes" Jan 29 15:52:48 crc kubenswrapper[4753]: I0129 15:52:48.150265 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:52:48 crc kubenswrapper[4753]: E0129 15:52:48.150971 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:52:50 crc kubenswrapper[4753]: I0129 15:52:50.755850 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-kwv4j_6ceea48e-1742-416c-9ecd-389b66588487/kube-rbac-proxy/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.017829 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-frr-files/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.184432 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-reloader/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.209549 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-frr-files/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.244038 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-kwv4j_6ceea48e-1742-416c-9ecd-389b66588487/controller/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.297273 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-metrics/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.405517 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-reloader/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.630136 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-metrics/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.650705 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-reloader/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.685628 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-metrics/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.685967 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-frr-files/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.847540 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-reloader/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.874697 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-metrics/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.874730 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/cp-frr-files/0.log" Jan 29 15:52:51 crc kubenswrapper[4753]: I0129 15:52:51.898427 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/controller/0.log" Jan 29 15:52:52 crc kubenswrapper[4753]: I0129 15:52:52.103930 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/frr-metrics/0.log" Jan 29 15:52:52 crc kubenswrapper[4753]: I0129 15:52:52.144923 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/kube-rbac-proxy-frr/0.log" Jan 29 15:52:52 crc kubenswrapper[4753]: I0129 15:52:52.167575 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/kube-rbac-proxy/0.log" Jan 29 15:52:52 crc kubenswrapper[4753]: I0129 15:52:52.356622 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/reloader/0.log" Jan 29 15:52:52 crc kubenswrapper[4753]: I0129 15:52:52.418295 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-cc5mj_d6ad7a91-6b35-4d1d-aa3d-e9e1a5a164c4/frr-k8s-webhook-server/0.log" Jan 29 15:52:52 crc kubenswrapper[4753]: I0129 15:52:52.696699 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5657cf448d-jpgpp_dd141147-06aa-42ff-86af-96ad3b852349/manager/0.log" Jan 29 15:52:52 crc kubenswrapper[4753]: I0129 15:52:52.748065 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b447d96c7-khxwk_82c917f2-b33e-4508-8f83-b54e5238fb38/webhook-server/0.log" Jan 29 15:52:53 crc kubenswrapper[4753]: I0129 15:52:53.033674 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tvgdk_ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec/kube-rbac-proxy/0.log" Jan 29 15:52:54 crc kubenswrapper[4753]: I0129 15:52:54.686944 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tvgdk_ca38a0c1-c5a0-4e33-8ea4-165facbeb3ec/speaker/0.log" Jan 29 15:52:55 crc kubenswrapper[4753]: I0129 15:52:55.637405 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xwns_110bd7ac-0311-4b68-82b0-8d33b63a24bc/frr/0.log" Jan 29 15:53:01 crc kubenswrapper[4753]: I0129 15:53:01.149817 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:53:01 crc kubenswrapper[4753]: E0129 15:53:01.150591 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:53:07 crc kubenswrapper[4753]: I0129 15:53:07.502093 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf_11f899c0-458d-470c-862b-a13ec652365c/util/0.log" Jan 29 15:53:07 crc kubenswrapper[4753]: I0129 15:53:07.754297 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf_11f899c0-458d-470c-862b-a13ec652365c/pull/0.log" Jan 29 15:53:07 crc kubenswrapper[4753]: I0129 15:53:07.761568 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf_11f899c0-458d-470c-862b-a13ec652365c/util/0.log" Jan 29 15:53:07 crc kubenswrapper[4753]: I0129 15:53:07.767614 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf_11f899c0-458d-470c-862b-a13ec652365c/pull/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.003387 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf_11f899c0-458d-470c-862b-a13ec652365c/extract/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.015014 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf_11f899c0-458d-470c-862b-a13ec652365c/util/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.016307 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dchcwjf_11f899c0-458d-470c-862b-a13ec652365c/pull/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.235542 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg_148bd6aa-767b-4aff-9fb1-e0a34a060121/util/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.394496 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg_148bd6aa-767b-4aff-9fb1-e0a34a060121/pull/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.407613 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg_148bd6aa-767b-4aff-9fb1-e0a34a060121/pull/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.429557 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg_148bd6aa-767b-4aff-9fb1-e0a34a060121/util/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.663327 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg_148bd6aa-767b-4aff-9fb1-e0a34a060121/pull/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.690682 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg_148bd6aa-767b-4aff-9fb1-e0a34a060121/util/0.log" Jan 29 15:53:08 crc 
kubenswrapper[4753]: I0129 15:53:08.699355 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713nj6zg_148bd6aa-767b-4aff-9fb1-e0a34a060121/extract/0.log" Jan 29 15:53:08 crc kubenswrapper[4753]: I0129 15:53:08.852881 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8_f141c868-e82d-452b-9ae8-3e160d964237/util/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.050447 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8_f141c868-e82d-452b-9ae8-3e160d964237/util/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.053676 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8_f141c868-e82d-452b-9ae8-3e160d964237/pull/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.104924 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8_f141c868-e82d-452b-9ae8-3e160d964237/pull/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.293862 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8_f141c868-e82d-452b-9ae8-3e160d964237/util/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.317558 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8_f141c868-e82d-452b-9ae8-3e160d964237/pull/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.378045 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5nlvq8_f141c868-e82d-452b-9ae8-3e160d964237/extract/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.501732 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_ea6b2cb6-43c4-4b8d-ac86-c5522959a43b/util/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.699836 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_ea6b2cb6-43c4-4b8d-ac86-c5522959a43b/pull/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.722853 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_ea6b2cb6-43c4-4b8d-ac86-c5522959a43b/pull/0.log" Jan 29 15:53:09 crc kubenswrapper[4753]: I0129 15:53:09.770203 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_ea6b2cb6-43c4-4b8d-ac86-c5522959a43b/util/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.036126 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_ea6b2cb6-43c4-4b8d-ac86-c5522959a43b/util/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.087446 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_ea6b2cb6-43c4-4b8d-ac86-c5522959a43b/pull/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.130084 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084rq44_ea6b2cb6-43c4-4b8d-ac86-c5522959a43b/extract/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.250405 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-shp7j_78aeb34e-8507-419c-ae21-144a722afc4a/extract-utilities/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.403507 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-shp7j_78aeb34e-8507-419c-ae21-144a722afc4a/extract-utilities/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.450430 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-shp7j_78aeb34e-8507-419c-ae21-144a722afc4a/extract-content/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.459986 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-shp7j_78aeb34e-8507-419c-ae21-144a722afc4a/extract-content/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.661915 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-shp7j_78aeb34e-8507-419c-ae21-144a722afc4a/extract-content/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.671513 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-shp7j_78aeb34e-8507-419c-ae21-144a722afc4a/extract-utilities/0.log" Jan 29 15:53:10 crc kubenswrapper[4753]: I0129 15:53:10.898431 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bzksl_02b5ed5f-a363-45d9-b107-1d33890a617c/extract-utilities/0.log" Jan 29 15:53:11 crc kubenswrapper[4753]: I0129 15:53:11.128904 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bzksl_02b5ed5f-a363-45d9-b107-1d33890a617c/extract-content/0.log" Jan 29 15:53:11 crc kubenswrapper[4753]: I0129 15:53:11.165036 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bzksl_02b5ed5f-a363-45d9-b107-1d33890a617c/extract-utilities/0.log" Jan 29 15:53:11 crc kubenswrapper[4753]: I0129 15:53:11.167854 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bzksl_02b5ed5f-a363-45d9-b107-1d33890a617c/extract-content/0.log" Jan 29 15:53:11 crc kubenswrapper[4753]: I0129 15:53:11.422847 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bzksl_02b5ed5f-a363-45d9-b107-1d33890a617c/extract-utilities/0.log" Jan 29 15:53:11 crc kubenswrapper[4753]: I0129 15:53:11.463255 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-shp7j_78aeb34e-8507-419c-ae21-144a722afc4a/registry-server/0.log" Jan 29 15:53:11 crc kubenswrapper[4753]: I0129 15:53:11.472845 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bzksl_02b5ed5f-a363-45d9-b107-1d33890a617c/extract-content/0.log" Jan 29 15:53:11 crc kubenswrapper[4753]: I0129 15:53:11.774781 4753 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xnjnx_62f2be0b-c83f-4b74-80cc-504f1221b322/marketplace-operator/0.log" Jan 29 15:53:11 crc kubenswrapper[4753]: I0129 15:53:11.934162 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8spmk_b9371f46-b818-44df-9f4f-4c04ac5fd78d/extract-utilities/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.148743 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8spmk_b9371f46-b818-44df-9f4f-4c04ac5fd78d/extract-content/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.149335 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:53:12 crc kubenswrapper[4753]: E0129 15:53:12.149758 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.179217 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8spmk_b9371f46-b818-44df-9f4f-4c04ac5fd78d/extract-content/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.196466 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8spmk_b9371f46-b818-44df-9f4f-4c04ac5fd78d/extract-utilities/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.422026 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8spmk_b9371f46-b818-44df-9f4f-4c04ac5fd78d/extract-utilities/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.445340 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8spmk_b9371f46-b818-44df-9f4f-4c04ac5fd78d/extract-content/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.532732 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bzksl_02b5ed5f-a363-45d9-b107-1d33890a617c/registry-server/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.663421 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l6qbh_3a8e6cb9-8e67-42dd-9827-812a46628fb5/extract-utilities/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.752897 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8spmk_b9371f46-b818-44df-9f4f-4c04ac5fd78d/registry-server/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.836894 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l6qbh_3a8e6cb9-8e67-42dd-9827-812a46628fb5/extract-content/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.863886 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l6qbh_3a8e6cb9-8e67-42dd-9827-812a46628fb5/extract-content/0.log" Jan 29 15:53:12 crc kubenswrapper[4753]: I0129 15:53:12.876400 4753 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-l6qbh_3a8e6cb9-8e67-42dd-9827-812a46628fb5/extract-utilities/0.log" Jan 29 15:53:13 crc kubenswrapper[4753]: I0129 15:53:13.021974 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l6qbh_3a8e6cb9-8e67-42dd-9827-812a46628fb5/extract-utilities/0.log" Jan 29 15:53:13 crc kubenswrapper[4753]: I0129 15:53:13.070790 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l6qbh_3a8e6cb9-8e67-42dd-9827-812a46628fb5/extract-content/0.log" Jan 29 15:53:13 crc kubenswrapper[4753]: I0129 15:53:13.957514 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-l6qbh_3a8e6cb9-8e67-42dd-9827-812a46628fb5/registry-server/0.log" Jan 29 15:53:14 crc kubenswrapper[4753]: I0129 15:53:14.675437 4753 scope.go:117] "RemoveContainer" containerID="5daee1292b901aa70277adf7b115daf106676adf5a935efd229bf172bd4b5ae9" Jan 29 15:53:14 crc kubenswrapper[4753]: I0129 15:53:14.715874 4753 scope.go:117] "RemoveContainer" containerID="8e5eb4735ff280a07a116e04bd45b5783dafc0d21b8867bbc82e0e110805be2e" Jan 29 15:53:14 crc kubenswrapper[4753]: I0129 15:53:14.760604 4753 scope.go:117] "RemoveContainer" containerID="85c759c0c34e73ab48cbd19681d76303b45380c75a0b0c24f7a82ba282c246c7" Jan 29 15:53:26 crc kubenswrapper[4753]: I0129 15:53:26.155801 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:53:26 crc kubenswrapper[4753]: E0129 15:53:26.156794 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:53:26 crc kubenswrapper[4753]: I0129 15:53:26.782286 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-k6ssj_8d9e5081-ef9a-4c43-a32a-e4917c8c1db2/prometheus-operator/0.log" Jan 29 15:53:26 crc kubenswrapper[4753]: I0129 15:53:26.877506 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b499c759f-4cprd_c0b81ce9-61fe-40de-9647-d14c933f2b13/prometheus-operator-admission-webhook/0.log" Jan 29 15:53:26 crc kubenswrapper[4753]: I0129 15:53:26.933515 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-b499c759f-5b2cp_830e25f0-520a-446b-a145-5c1f0f3ceea1/prometheus-operator-admission-webhook/0.log" Jan 29 15:53:27 crc kubenswrapper[4753]: I0129 15:53:27.215577 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dwlls_785cd806-53b9-41c5-bde3-8445651ffff5/perses-operator/0.log" Jan 29 15:53:27 crc kubenswrapper[4753]: I0129 15:53:27.242883 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-j8wsv_45d80c47-de8f-426c-a6cf-c46adcf7394a/operator/0.log" Jan 29 15:53:38 crc kubenswrapper[4753]: I0129 15:53:38.149868 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:53:38 crc 
kubenswrapper[4753]: E0129 15:53:38.150710 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:53:52 crc kubenswrapper[4753]: I0129 15:53:52.149835 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:53:52 crc kubenswrapper[4753]: E0129 15:53:52.150795 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:54:03 crc kubenswrapper[4753]: I0129 15:54:03.149426 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:54:03 crc kubenswrapper[4753]: E0129 15:54:03.150331 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:54:15 crc kubenswrapper[4753]: I0129 15:54:15.150987 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:54:15 crc kubenswrapper[4753]: E0129 15:54:15.152114 4753 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-x6rpz_openshift-machine-config-operator(49d14260-5f77-47b9-97e1-c843cf322a0f)\"" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" Jan 29 15:54:27 crc kubenswrapper[4753]: I0129 15:54:27.149539 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:54:27 crc kubenswrapper[4753]: I0129 15:54:27.946318 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"f0726e111d3975eab33efe41698e8b784342b6e1637dbfdf2138bc8175393c25"} Jan 29 15:55:04 crc kubenswrapper[4753]: I0129 15:55:04.304251 4753 generic.go:334] "Generic (PLEG): container finished" podID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" containerID="904f62a3e8ed0487f213fe5c97d5a9b335be76524de2cb8e5216f153c44b4eba" exitCode=0 Jan 29 15:55:04 crc kubenswrapper[4753]: I0129 15:55:04.304288 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5z222/must-gather-tl9tg" 
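The repeating RemoveContainer/CrashLoopBackOff pairs above show the kubelet's restart back-off at its cap: each pod sync is rejected with "back-off 5m0s restarting failed container" until the back-off window expires, after which the container is started again (the ContainerStarted event at 15:54:27). A minimal sketch of a capped doubling back-off, assuming the kubelet's documented defaults of a 10s initial delay doubling up to a 5m cap (not the kubelet's actual implementation):

    // Capped doubling restart back-off, as visible in the log above.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	const (
    		initialDelay = 10 * time.Second // assumed kubelet default
    		maxDelay     = 5 * time.Minute  // matches "back-off 5m0s" in the log
    	)
    	delay := initialDelay
    	for crash := 1; crash <= 7; crash++ {
    		fmt.Printf("crash %d: next restart attempt in %v\n", crash, delay)
    		delay *= 2
    		if delay > maxDelay {
    			delay = maxDelay // further crashes keep reporting the 5m0s cap
    		}
    	}
    }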
event={"ID":"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1","Type":"ContainerDied","Data":"904f62a3e8ed0487f213fe5c97d5a9b335be76524de2cb8e5216f153c44b4eba"} Jan 29 15:55:04 crc kubenswrapper[4753]: I0129 15:55:04.305541 4753 scope.go:117] "RemoveContainer" containerID="904f62a3e8ed0487f213fe5c97d5a9b335be76524de2cb8e5216f153c44b4eba" Jan 29 15:55:04 crc kubenswrapper[4753]: I0129 15:55:04.727666 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5z222_must-gather-tl9tg_ae568994-afdf-4e64-a3c3-86ba8e7fbbf1/gather/0.log" Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.235336 4753 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5z222/must-gather-tl9tg"] Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.236533 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5z222/must-gather-tl9tg" podUID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" containerName="copy" containerID="cri-o://649861c985e6502475eae3a1faf0aa84f466446c36b9a4053e44f418cf0b8d79" gracePeriod=2 Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.247773 4753 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5z222/must-gather-tl9tg"] Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.383403 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5z222_must-gather-tl9tg_ae568994-afdf-4e64-a3c3-86ba8e7fbbf1/copy/0.log" Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.384288 4753 generic.go:334] "Generic (PLEG): container finished" podID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" containerID="649861c985e6502475eae3a1faf0aa84f466446c36b9a4053e44f418cf0b8d79" exitCode=143 Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.774903 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5z222_must-gather-tl9tg_ae568994-afdf-4e64-a3c3-86ba8e7fbbf1/copy/0.log" Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.775916 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5z222/must-gather-tl9tg" Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.833738 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qts6l\" (UniqueName: \"kubernetes.io/projected/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-kube-api-access-qts6l\") pod \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\" (UID: \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\") " Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.833907 4753 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-must-gather-output\") pod \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\" (UID: \"ae568994-afdf-4e64-a3c3-86ba8e7fbbf1\") " Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.839198 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-kube-api-access-qts6l" (OuterVolumeSpecName: "kube-api-access-qts6l") pod "ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" (UID: "ae568994-afdf-4e64-a3c3-86ba8e7fbbf1"). InnerVolumeSpecName "kube-api-access-qts6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.939435 4753 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qts6l\" (UniqueName: \"kubernetes.io/projected/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-kube-api-access-qts6l\") on node \"crc\" DevicePath \"\"" Jan 29 15:55:12 crc kubenswrapper[4753]: I0129 15:55:12.977945 4753 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" (UID: "ae568994-afdf-4e64-a3c3-86ba8e7fbbf1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 29 15:55:13 crc kubenswrapper[4753]: I0129 15:55:13.042412 4753 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 29 15:55:13 crc kubenswrapper[4753]: I0129 15:55:13.400889 4753 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5z222_must-gather-tl9tg_ae568994-afdf-4e64-a3c3-86ba8e7fbbf1/copy/0.log" Jan 29 15:55:13 crc kubenswrapper[4753]: I0129 15:55:13.401807 4753 scope.go:117] "RemoveContainer" containerID="649861c985e6502475eae3a1faf0aa84f466446c36b9a4053e44f418cf0b8d79" Jan 29 15:55:13 crc kubenswrapper[4753]: I0129 15:55:13.401957 4753 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5z222/must-gather-tl9tg" Jan 29 15:55:13 crc kubenswrapper[4753]: I0129 15:55:13.428504 4753 scope.go:117] "RemoveContainer" containerID="904f62a3e8ed0487f213fe5c97d5a9b335be76524de2cb8e5216f153c44b4eba" Jan 29 15:55:14 crc kubenswrapper[4753]: I0129 15:55:14.161714 4753 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" path="/var/lib/kubelet/pods/ae568994-afdf-4e64-a3c3-86ba8e7fbbf1/volumes" Jan 29 15:55:14 crc kubenswrapper[4753]: I0129 15:55:14.899728 4753 scope.go:117] "RemoveContainer" containerID="ffff470bf90e6ab1c4427b7c7787bf0c2b01ef052927ae444d6166de3ede347a" Jan 29 15:55:14 crc kubenswrapper[4753]: I0129 15:55:14.934762 4753 scope.go:117] "RemoveContainer" containerID="a2f9c69f1aedd55a35a0724dfa115e5098143341d0967f2a5dbfb71aed455ecf" Jan 29 15:55:14 crc kubenswrapper[4753]: I0129 15:55:14.977078 4753 scope.go:117] "RemoveContainer" containerID="85215c46458a812dac16706990d05ca42d06ff7309e5a07bbc984a3fa28452d7" Jan 29 15:56:15 crc kubenswrapper[4753]: I0129 15:56:15.117619 4753 scope.go:117] "RemoveContainer" containerID="04c3ed89d5ac7394ffc9b1a06d61f21d39c06d671a648412053630f80a79465c" Jan 29 15:56:27 crc kubenswrapper[4753]: I0129 15:56:27.055029 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:56:27 crc kubenswrapper[4753]: I0129 15:56:27.055639 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 29 15:56:57 crc kubenswrapper[4753]: I0129 15:56:57.054839 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:56:57 crc kubenswrapper[4753]: I0129 15:56:57.055566 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:57:15 crc kubenswrapper[4753]: I0129 15:57:15.187402 4753 scope.go:117] "RemoveContainer" containerID="97d3c1b5eac417f7bd468f1fc2543982fcaf77feab98122f74f4b88baf112665" Jan 29 15:57:27 crc kubenswrapper[4753]: I0129 15:57:27.054763 4753 patch_prober.go:28] interesting pod/machine-config-daemon-x6rpz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 29 15:57:27 crc kubenswrapper[4753]: I0129 15:57:27.055431 4753 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 29 15:57:27 crc kubenswrapper[4753]: I0129 15:57:27.055483 4753 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" Jan 29 15:57:27 crc kubenswrapper[4753]: I0129 15:57:27.056305 4753 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0726e111d3975eab33efe41698e8b784342b6e1637dbfdf2138bc8175393c25"} pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 29 15:57:27 crc kubenswrapper[4753]: I0129 15:57:27.056362 4753 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" podUID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerName="machine-config-daemon" containerID="cri-o://f0726e111d3975eab33efe41698e8b784342b6e1637dbfdf2138bc8175393c25" gracePeriod=600 Jan 29 15:57:27 crc kubenswrapper[4753]: I0129 15:57:27.888141 4753 generic.go:334] "Generic (PLEG): container finished" podID="49d14260-5f77-47b9-97e1-c843cf322a0f" containerID="f0726e111d3975eab33efe41698e8b784342b6e1637dbfdf2138bc8175393c25" exitCode=0 Jan 29 15:57:27 crc kubenswrapper[4753]: I0129 15:57:27.889226 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerDied","Data":"f0726e111d3975eab33efe41698e8b784342b6e1637dbfdf2138bc8175393c25"} Jan 29 15:57:27 crc kubenswrapper[4753]: I0129 15:57:27.889301 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-x6rpz" 
event={"ID":"49d14260-5f77-47b9-97e1-c843cf322a0f","Type":"ContainerStarted","Data":"28d909eac1b59a16369c05ff138cca1d3bf12a1b0d5c08c062e017bff29b1c17"} Jan 29 15:57:27 crc kubenswrapper[4753]: I0129 15:57:27.889335 4753 scope.go:117] "RemoveContainer" containerID="ccd1c74947f06d9156433bc7d46ee40c7d12bd9aef4249fa507cbcdff3fc743e" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.063572 4753 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7zvtk"] Jan 29 15:58:15 crc kubenswrapper[4753]: E0129 15:58:15.065828 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" containerName="gather" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.065933 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" containerName="gather" Jan 29 15:58:15 crc kubenswrapper[4753]: E0129 15:58:15.066040 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e88645-493b-49af-a4d1-f49e759b2d09" containerName="extract-utilities" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.066127 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e88645-493b-49af-a4d1-f49e759b2d09" containerName="extract-utilities" Jan 29 15:58:15 crc kubenswrapper[4753]: E0129 15:58:15.066235 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1" containerName="container-00" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.066346 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1" containerName="container-00" Jan 29 15:58:15 crc kubenswrapper[4753]: E0129 15:58:15.066446 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e88645-493b-49af-a4d1-f49e759b2d09" containerName="extract-content" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.066520 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e88645-493b-49af-a4d1-f49e759b2d09" containerName="extract-content" Jan 29 15:58:15 crc kubenswrapper[4753]: E0129 15:58:15.066641 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" containerName="copy" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.066712 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" containerName="copy" Jan 29 15:58:15 crc kubenswrapper[4753]: E0129 15:58:15.066800 4753 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e88645-493b-49af-a4d1-f49e759b2d09" containerName="registry-server" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.066878 4753 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e88645-493b-49af-a4d1-f49e759b2d09" containerName="registry-server" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.067202 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="70b32bf8-b434-4f49-ad9e-c2ffdd54f1b1" containerName="container-00" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.067306 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" containerName="gather" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.067391 4753 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e88645-493b-49af-a4d1-f49e759b2d09" containerName="registry-server" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.067459 4753 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="ae568994-afdf-4e64-a3c3-86ba8e7fbbf1" containerName="copy" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.069222 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.094222 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7zvtk"] Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.130956 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbhjl\" (UniqueName: \"kubernetes.io/projected/a4da0ada-58c4-4cc5-a060-1eea6d35e8e2-kube-api-access-tbhjl\") pod \"community-operators-7zvtk\" (UID: \"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2\") " pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.131005 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4da0ada-58c4-4cc5-a060-1eea6d35e8e2-utilities\") pod \"community-operators-7zvtk\" (UID: \"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2\") " pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.131058 4753 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4da0ada-58c4-4cc5-a060-1eea6d35e8e2-catalog-content\") pod \"community-operators-7zvtk\" (UID: \"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2\") " pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.235879 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbhjl\" (UniqueName: \"kubernetes.io/projected/a4da0ada-58c4-4cc5-a060-1eea6d35e8e2-kube-api-access-tbhjl\") pod \"community-operators-7zvtk\" (UID: \"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2\") " pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.235944 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4da0ada-58c4-4cc5-a060-1eea6d35e8e2-utilities\") pod \"community-operators-7zvtk\" (UID: \"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2\") " pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.235991 4753 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4da0ada-58c4-4cc5-a060-1eea6d35e8e2-catalog-content\") pod \"community-operators-7zvtk\" (UID: \"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2\") " pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.236528 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4da0ada-58c4-4cc5-a060-1eea6d35e8e2-utilities\") pod \"community-operators-7zvtk\" (UID: \"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2\") " pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.236800 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4da0ada-58c4-4cc5-a060-1eea6d35e8e2-catalog-content\") pod \"community-operators-7zvtk\" 
(UID: \"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2\") " pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.256044 4753 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbhjl\" (UniqueName: \"kubernetes.io/projected/a4da0ada-58c4-4cc5-a060-1eea6d35e8e2-kube-api-access-tbhjl\") pod \"community-operators-7zvtk\" (UID: \"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2\") " pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.391204 4753 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7zvtk" Jan 29 15:58:15 crc kubenswrapper[4753]: I0129 15:58:15.960710 4753 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7zvtk"] Jan 29 15:58:16 crc kubenswrapper[4753]: I0129 15:58:16.337650 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zvtk" event={"ID":"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2","Type":"ContainerStarted","Data":"fa6953501a44f8eb45bcf517e5d884df3c13b2f9f38f0f90d6624be89e7af56b"} Jan 29 15:58:17 crc kubenswrapper[4753]: I0129 15:58:17.349103 4753 generic.go:334] "Generic (PLEG): container finished" podID="a4da0ada-58c4-4cc5-a060-1eea6d35e8e2" containerID="145b48f38e31e4409ce03ebae12974ed1cdfa6e828108fe0cf589277598ef4e6" exitCode=0 Jan 29 15:58:17 crc kubenswrapper[4753]: I0129 15:58:17.349234 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zvtk" event={"ID":"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2","Type":"ContainerDied","Data":"145b48f38e31e4409ce03ebae12974ed1cdfa6e828108fe0cf589277598ef4e6"} Jan 29 15:58:17 crc kubenswrapper[4753]: I0129 15:58:17.352091 4753 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 29 15:58:20 crc kubenswrapper[4753]: I0129 15:58:20.375478 4753 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7zvtk" event={"ID":"a4da0ada-58c4-4cc5-a060-1eea6d35e8e2","Type":"ContainerStarted","Data":"8ee86dfc09c6478b9a0f2c07486214056b02db3e5d58d7cd33c7127405499fb3"} var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515136701647024457 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015136701650017366 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015136663642016521 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015136663642015471 5ustar corecore